
2025 | Book

Proceedings of Third International Conference on Computing and Communication Networks

ICCCN 2023, Volume 2

Edited by: Giancarlo Fortino, Akshi Kumar, Abhishek Swaroop, Pancham Shukla

Publisher: Springer Nature Singapore

Book series: Lecture Notes in Networks and Systems

About this book

This book includes selected peer-reviewed papers presented at the Third International Conference on Computing and Communication Networks (ICCCN 2023), held at Manchester Metropolitan University, UK, during 17–18 November 2023. The book covers topics in network and computing technologies, artificial intelligence and machine learning, security and privacy, communication systems, cyber-physical systems, data analytics, cybersecurity for Industry 4.0, and smart and sustainable environmental systems.

Table of Contents

Frontmatter
Fake Trend Detection in Twitter Using Machine Learning

Social media plays a major part in everyday life, helping people interact and connect with one another. It has created large communities that influence what people read, do, or even think. The influence mechanism of Twitter trends is exploited using bots and spam accounts, whereby many tweets on a topic are artificially boosted into trends. Such fake trends are common on social media and are used to promote gambling, information phishing, disinformation campaigns, political slogans, and hate speech. Manipulating Twitter trends has serious implications because trends attract broad attention: wider media channels cover trends, treating them as a stand-in for what people are talking about. When this stand-in is manipulated, the public's perception of the topics being discussed is clouded. It is therefore important to detect fake trends and ensure that the public is not exposed to them. The proposed fake trend detection system classifies Twitter accounts into bot and human accounts. A Random Forest model trained on the TwiBot-20 benchmark dataset classifies bot accounts with 94% accuracy. Metadata features and influence entropies are extracted from the bot accounts. The influence entropy is measured by constructing an influence graph with igraph and NetworkX, analyzing the retweet count and user mentions in each tweet of an account. The trending topic of the tweets is analyzed, and the authenticity of the trend is categorized using one-dimensional clustering. A web application is developed so that users without technical knowledge can use the proposed fake trend detection system.

Valliyammai Chinnaiah, Manikandan Dhayanithi, Santhosh Patturaj, Ramanujan Ranganathan, Vishnu B. A. Mohan
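The abstract above mentions building an influence graph with igraph and NetworkX and measuring an influence entropy from retweet counts and user mentions. The minimal sketch below illustrates that idea using NetworkX only, on a made-up toy edge list; the node names, weights, and the exact entropy definition are assumptions, not the authors' code.

```python
# Illustrative sketch (not the paper's implementation): influence graph + entropy.
import math
import networkx as nx

# Each tuple: (source_account, influenced_account, weight = retweets + mentions)
interactions = [
    ("bot_1", "user_a", 12), ("bot_1", "user_b", 9),
    ("bot_2", "user_a", 3),  ("user_c", "bot_1", 1),
]

G = nx.DiGraph()
for src, dst, w in interactions:
    G.add_edge(src, dst, weight=w)

def influence_entropy(graph, node):
    """Shannon entropy of how a node's outgoing influence spreads over neighbours."""
    weights = [d["weight"] for _, _, d in graph.out_edges(node, data=True)]
    total = sum(weights)
    if total == 0:
        return 0.0
    probs = [w / total for w in weights]
    return -sum(p * math.log2(p) for p in probs)

for account in G.nodes:
    print(account, round(influence_entropy(G, account), 3))
```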
Secure Blockchain Model for IoMT Smart Mobility System

With technological advancements, electronic health records (EHRs) have replaced paper records to improve the quality and effectiveness of patient treatment in civil hospitals and military treatment facilities. These records must be accessible to doctors and nurses over the Internet to help them treat patients. In addition, the enormous quantity of medical data for each patient makes it difficult to secure, leaving medical data open to theft by hackers. Organizing and securing patients' medical data in both civil and military hospitals is therefore one of the main issues in the healthcare field. A new blockchain technique is proposed to secure patients' medical data in a decentralized way, built on smart mobility technology. A smart Internet of Medical Things (IoMT) mobility system collects patient vital signs and stores them in the blockchain. A modified version of the Nik-512 algorithm, which uses the LZ4 algorithm to compress data, links the blocks in the blockchain. This work aims to minimize the hash code computation time by preventing the hash code from beginning with zeros. We also build an authentication parity technique that checks the identities of the doctor and patient before a transaction happens, and we develop a new master contract system that guarantees the transfer of money from the patient's wallet to the doctor's wallet after the doctor sends the treatment to the patient. The experimental results show that the proposed work using Nik-512 is faster than all previously compared blockchain versions.

Ibrahim Shawky Farahat, Mohamed Elhoseny, Samir Elmougy, Abedallah Zaid Abualkishik, Waleed Aladrousy, Ahmed Elsaid Tolba
A Tamper-Proof Smart Contract Metamodel for Blockchain to Optimise Computational Latency

In recent years, blockchain technology has garnered considerable interest on account of its peer-to-peer, distributed consensus, and anonymity functionalities. A smart contract functions as a verifiable, self-executing, and self-validating computing layer. This study seeks to identify the key concepts and associated future research goals pertaining to blockchain-based smart contracts. Smart contracts enable the implementation of programmable assets, including money, and the automation of business logic processes that were previously executed manually. Accordingly, a comprehensive examination of smart contracts employing blockchain technology is undertaken, and we propose an innovative and practical metamodel for ensuring the integrity of protocols within smart contracts facilitated by blockchain technology. The article begins with an explanation of smart contracts' operational models, functionalities, and applications. Analysis of the results reveals that the use of the model alleviates the workload associated with numerous measurements. The mean response latency, computation time, and overhead were found to be 1.47 s, 1.19 s, and 3.04 s, respectively.

Ratul Sengupta, Ruchika Srivastava, Sushruta Mishra, Laith Abualigah
Correlation Among Learners’ Economic Ability, Attitude Toward ICT, and Reading Performance: An Exploration Among Twenty-First Century Teacher Aspirants

The rise of information and communication technology (ICT) has brought a prevalence of digital reading materials. The fast-paced shift from paper-based learning to technology-based learning has paved the way for education to embrace ICT in every aspect. Due to the urgent necessity of utilizing ICT in education, several studies have explored learners' and teachers' attitudes toward ICT. However, there is a scarcity of research that focuses mainly on learners and features one of the primary skills in gaining knowledge: reading. This study aims to determine the correlation among learners' economic ability, attitude toward ICT, and reading performance. This quantitative study identified 90 aspiring teachers from the College of Teacher Education at Western Mindanao State University with prior experience of online learning during the COVID-19 pandemic who are now slowly adapting to the new normal system of education. The results of the study reveal that there is a correlation among learners' economic ability, attitude toward ICT, and reading performance. This signifies that if respondents have access to the actual technology, it will have a direct impact on their attitude toward ICT and will help them improve their reading performance. The findings of this investigation serve as a hint that learners in today's generation work better with technology.

Poulyn B. Lozada, Jovannie Sarona, Divine Grace M. Marumas, Nurun-nisa B. Hasan, Fhadzralyn A. Karanain, Ericson O. Alieto
A Review of Reentrancy Attack in Ethereum Smart Contracts

Blockchain technology has catalyzed a revolutionary shift toward decentralized applications, prominently exemplified by Ethereum's introduction of smart contracts. A smart contract is a self-executing program running on the Ethereum Virtual Machine (EVM), designed to automate and ensure trust in transactions, bypassing traditional intermediaries. Nevertheless, as their adoption proliferates, inherent vulnerabilities come to the fore, thereby highlighting significant security challenges. Notably, Reentrancy attacks, underscored by the 2016 DAO hack that precipitated a staggering loss of approximately $60 million in Ether, stand out as paramount concerns. This paper offers a comprehensive review of Reentrancy attacks targeting Ethereum smart contracts. It elucidates the mechanics underpinning such attacks, pinpointing recurrent patterns and susceptibilities. Concurrently, an exploration of the trajectory of countermeasures and contemporary solutions proposed within the research sphere is undertaken. Through a detailed analysis of both the nature of the attacks and the corresponding mitigation strategies, this work emphasizes potential future directions, offering invaluable insights to guide efforts in enhancing the robustness and security of Ethereum's smart contracts.

Salam Al-E’mari, Yousef Sanjalawe
A Comparative Analysis of Population-Based Algorithm for Optimizing Cost and Makespan for Task Scheduling in Cloud–Fog Environment

As cloud computing makes the use of services and various resources easier for the user, there has been a growing need for efficient and quick resource utilization and services to enhance performance, and that is where fog computing comes in. Fog computing is an extension of cloud computing that takes place at the network's edge. Simply described, it is a hybrid of cloud computing and the Internet of Things. It is a highly virtualized platform that connects end devices to standard cloud servers and delivers computation, storage, and networking services. Various task scheduling algorithms are used to schedule tasks at fog nodes and increase the performance of the system. Task scheduling algorithms are classified into stochastic, deterministic, and hybrid algorithms. Our work presents a performance evaluation of three population-based optimization algorithms used for workflow scheduling in a combined cloud and fog environment. The following algorithms were compared: particle swarm optimization (PSO), genetic algorithm (GA), and a hybrid combination of PSO and GA. The evaluation function consists of three parameters: makespan, cost, and energy. The performance of these algorithms is also analyzed by varying the numbers of cloud nodes and fog nodes while keeping the number of end devices fixed. The simulator used for the comparative evaluation is the recently proposed FogWorkflowSim. The research shows that the hybrid PSO-GA algorithm performs better than the traditional PSO and GA algorithms in terms of cost and makespan. In future work, performance will be evaluated by increasing the number of tasks, more workflows will be added, and the algorithms will be evaluated against workflow deadlines.

Shivam Sharma, Amandeep Verma
Image Integrity Checking Using Watermarking in Cloud Computing: A Review

In today's era of cloud computing, modification and tampering of digital images in cloud storage have become easier due to the proliferation of digital image processing tools. Consequently, tamper detection and integrity checking of images on remote servers have emerged as a major concern. Image watermarking techniques provide an efficient way to tackle such problems. This paper provides a systematic examination of existing watermarking methods currently in use in the cloud computing environment, along with a comparative analysis of those techniques.

Jyoti Rani, Rajender Nath
A Blockchain-Based Decentralized Application System for VANET FDIA Detection

The growing prominence of intelligent transportation systems (ITS) has led to increased research on vehicular ad hoc networks (VANETs). Blockchain technology offers promising solutions for ensuring data security, integrity, and immutability within VANETs. However, VANETs remain vulnerable to false data injection attacks (FDIAs), which can significantly disrupt network operations and endanger safety. This paper proposes a novel approach using a blockchain-based decentralized application system (Dapp) for FDIA detection in VANETs. The Dapp leverages smart contracts to manage crucial aspects like data provenance, reputation, and consensus mechanisms, and it uses a proof-of-stake (PoS) consensus algorithm to guarantee data consistency and fault tolerance even in the presence of malicious nodes. To manage vehicular data, users access the system via the Dapp, an Ethereum distributed application. The aim of this article is to integrate a blockchain-based IPFS-Dapp system and AI/ML/big data models with the blockchain to provide solutions for false data injection attacks. Data are collected through radio channels, smart sensors, smart antennas, and RSUs, then uploaded using the Dapp, classified via smart contracts, and finally sent to Ethereum. A machine learning SVR algorithm, vehicular network-based support vector regression (VBSVR), is used to identify attacker nodes. We have written pseudocode for using IPFS with Ethereum; the proposed research methodology includes the integration of blockchain and Dapps with smart contracts, the integration of AI/ML/big data models with the blockchain, and the integration of an ML oracle under the blockchain, which together are well suited to detecting false data injection attack nodes, with detections logged on an IPFS decentralized server.

Grover Preeti, Prasad Sanjeev Kumar
A Novel Key Generation Algorithm Utilizing Lattice-Based Cryptography Principles

Technology advancement is inevitable and has created a revolution in all sectors, creating huge opportunities. Artificial intelligence and machine learning have accelerated this advancement, making life easier and more comfortable. On the other hand, these advancements depend heavily on the connectivity through which all terminals communicate. Thus, all technology development depends on the communication paradigm, particularly the wireless medium. Securing information and infrastructure is crucial, and it demands computationally hard problems that cannot be broken even by powerful machines such as quantum computers. The emergence of the quantum computer has made many existing cryptographic algorithms vulnerable, creating space for advanced algorithms that are safe from quantum attacks. Key generation is the foundational process of any cryptographic algorithm, and in this research we propose a novel lattice-based key generation mechanism using mathematical functions. The proposal is evaluated under theoretical assumptions and is found to be safe and secure.

Rajakumar Arul, B. Sivaselvam, Ankush Choudhary
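The abstract describes a lattice-based key generation mechanism without construction details. As a hedged illustration of the general family of techniques, the toy sketch below generates a Learning-With-Errors (LWE) style key pair in NumPy; the LWE construction and all parameters are textbook assumptions, not the authors' proposed algorithm, and the sizes are far too small to be secure.

```python
# Toy LWE-style key generation sketch (illustrative only, not a secure scheme).
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 257, 8, 16                 # toy modulus and lattice dimensions (assumptions)

A = rng.integers(0, q, size=(m, n))  # public random matrix
s = rng.integers(0, q, size=n)       # secret vector
e = rng.integers(-2, 3, size=m)      # small error vector
b = (A @ s + e) % q                  # public vector b = A*s + e (mod q)

public_key = (A, b)
private_key = s
print("public b:", b)
```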
Cryptocurrencies and Their Economic Implications: Analyse the Impact of Cryptocurrencies Like Bitcoin on Traditional Financial Systems, Including Their Potential to Disrupt Traditional Banking and the Role of Blockchain Technology in Creating Decentralised Economies

As the world of information and communication expands at a dizzying rate, more and more of our everyday routines are moving online, where they can be performed with greater ease and efficiency. Cryptocurrencies are digital representations of valuable intangibles utilised in a variety of online applications and networks, from social media and online gaming to virtual economies and peer-to-peer systems. In recent years, the adoption of virtual currency has skyrocketed across a wide range of platforms. Cryptocurrency is a decentralised digital currency that uses an encrypted peer-to-peer network to facilitate digital trade. In its role as a disruptive technology, Bitcoin, the first and most popular cryptocurrency, is challenging the status quo of traditional financial payment methods that have remained mostly unchanged for decades. Even though cryptocurrencies are not going to replace fiat currency anytime soon, they have the potential to alter the way in which Internet-enabled global markets interact with one another by removing the restrictions surrounding standard national currencies and exchange rates. This article delves into the wide-ranging effects of Bitcoin and other cryptocurrencies on current banking structures and the possible disruptions they could cause in the future. To better comprehend the economic ramifications of cryptocurrencies and blockchain technology and how they can affect future economic systems, this study is a useful resource for policymakers, financial professionals, and academics.

Priti Rai, Deepa Gupta, Mukul Gupta
Detecting Skin Cancer Disease Using LSTM (RNN) Based on a Modified Electromagnetic Field Optimization Algorithm

Skin cancer is a widespread malignant disease arising in the skin tissue. It is life-threatening, and catching it at an early stage is essential to save patients' lives. Early detection with accurate diagnosis can cure the disease and reduce mortality rates, and that is the key objective of this research. Recent studies have focused on machine learning (ML) and deep learning (DL) algorithms to diagnose skin cancer using dermoscopic images. This study employs a computer-aided detection mechanism for skin cancer using an LSTM model for accurate classification. The LSTM is improved with the Modified Electromagnetic Field Optimization Algorithm (MEFOA) to adjust its parameters and increase accuracy. The LSTM-MEFOA model uses the MNIST dataset with 10,000 images for classification. The proposed work attained an accuracy of 98.99%, precision of 97.87%, recall of 98.65%, and F1-score of 94.54%. The experimental outcome shows that LSTM-MEFOA outperforms the present state-of-the-art techniques and achieves a high accuracy level.

E. Gangadevi, M. Lawanyashri, Rajesh Kumar Dhanaraj, Selvanayaki Kolandapalayam Shanmugam, Balamurugan Balusamy, K. Santhi
Hyper-parameter Tuning of CNN Using Improved Elephant Herding Optimisation for Detection of Skin Cancer

Skin cancer is one of the top three tumours caused by DNA damage and has a dire prognosis. As a result of this damaged DNA, cells start to grow out of control and keep doubling in number. This study recommends using a hyper-parameter-optimised convolutional neural network to identify the kind of skin cancer. In this method, an improved elephant herding optimisation (IEHO) algorithm is used to optimise the CNN's hyper-parameters. A bilateral filter, a nonlinear denoising method that can lessen noise while maintaining edges, is used to treat the texture and noise in the skin cancer photo images. In the proposed IEHO, the separation operator is changed to a sine-cosine operator and opposition-based learning is incorporated. According to simulation results, the proposed model can produce testing accuracy of up to 99.33%, around 4% and 1% higher than the other baseline models. The experimental findings demonstrate that the recommended model outperforms other published models.

V. Asha, N. Uma, G. Siva Shankar, Balasubramanian Prabhu Kavin, Rajesh Kumar Dhanaraj
An Ensemble-Based Approach for Smart Detection of Heart Diseases Symptoms

Machine learning is widely used in health care. Heart disease is the world's most dangerous chronic life-threatening disease, and with heart disease rates on the rise, we need a system that can detect and prevent heart disease symptoms early on. Many machine learning algorithms are being used to predict the early symptoms of coronary heart disease, but the majority of them suffer from low accuracy rates. In this study, an ensemble-based approach is used to train the model on the heart disease dataset obtained from the UCI repository. The model captures various heart disease data from different sensors and then uses an ensemble-based random forest algorithm to accurately detect the symptoms pertaining to heart disease risks in patients. The model's accuracy is quite satisfactory at 93.8% with the use of 40 decision trees in the random forest. Thus, the proposed model can serve as a reliable tool for medical experts in detecting heart disease symptoms with greater accuracy.

Prasanalakshmi Balaji, Shivam Bansal, Subham Banerjee, Sushruta Mishra, R. Sridevi
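A hedged sketch of the core classifier described above: a random forest with 40 decision trees on tabular features. Synthetic data generated with make_classification stands in for the UCI heart disease dataset; everything except the tree count is an illustrative assumption, not the authors' pipeline.

```python
# Illustrative 40-tree random forest on synthetic tabular data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=13, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=40, random_state=42)  # 40 decision trees
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```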
NSGA-II-MOGWO: A Novel Hybrid Algorithm for IoT-Fog Environment Resources Allocation

Resource allocation in the IoT-Fog environment is a challenging and critical problem with profound implications for application performance and service provider profitability. Efficiently distributing tasks across fog nodes enhances quality of service (QoS) metrics like latency for application users and reduces network resource utilization for service providers. This work proposes a hybrid meta-heuristic optimization approach, combining the non-dominated sorting genetic algorithm II (NSGA-II) and multi-objective grey wolf optimization (MOGWO) algorithms, to address resource allocation. Comparative evaluations against five multi-objective optimization algorithms reveal the superiority of the proposed NSGA-II-MOGWO algorithm in approximating the Pareto front and generating diverse solutions for benchmark functions. Additionally, when tested against cloud-based resource placement and NSGA-II, the proposed algorithm significantly improves users' QoS metrics (e.g. time latency and response time) and reduces service providers' costs for computation resource utilization and energy consumption. The implementation uses the Python fog simulator YAFS.

Balasem A. Hussein, Soukaena H. Hashem
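Both NSGA-II and MOGWO rely on Pareto dominance to rank candidate resource allocations. The sketch below shows only that shared building block, a non-dominated filter over two minimisation objectives (for example latency and cost), on made-up numbers; it is not the proposed hybrid algorithm.

```python
# Pareto non-dominated filter for two minimisation objectives (illustrative only).
import numpy as np

def non_dominated(points):
    """Return indices of points not dominated by any other point (minimisation)."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p) for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Columns: [latency, cost] for five candidate allocations (made-up values)
objs = np.array([[1.0, 9.0], [2.0, 6.0], [3.0, 7.0], [4.0, 3.0], [5.0, 4.0]])
print("Pareto front indices:", non_dominated(objs))  # -> [0, 1, 3]
```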
Low-Cost Fractal-Based Multi-band Microstrip Bandpass Filter

This paper introduces the design of a flat microstrip tri-band bandpass filter based on the first-iteration Peano fractal. The proposed structure utilizes coupled lines and a loaded symmetric two-coupled line, put forward as a potential solution for wireless communication applications. A novel bandpass filter with high selectivity and good response characteristics was designed. The tri-band filter was simulated and fabricated on a substrate made of Rogers RO4350B material. The dimensions of the proposed filter are calculated to be (0.22 × 0.23) $$\lambda_{g}^{2}$$. The simulation results demonstrate that the proposed tri-band bandpass filter exhibits three distinct passbands located at (2.33–2.57), (3.73–3.85), and (5.57–5.94) GHz. The minimum insertion loss values are 0.8 dB, 1.9 dB, and 1.2 dB, respectively. The simulated findings show that the suggested filter has favorable transmission and reflection characteristics together with a notable level of out-of-band rejection, and there is reasonable agreement between the measured and simulated results. Several advantages are attributed to this filter, including low loss, compact dimensions, and significant attenuation across the three passbands.

Nagham R. Oleiwi, Mohammed F. Hasan
Design of Deep Learning Methodology for Side-Channel Attack Detection Based on Power Leakages

The assessment of a cryptographic implementation's worst-case security relies critically on profiled attacks. A substantial amount of attention has been paid to the development of profiled attacks over the last sixteen years, including both deep learning-based and template-based attacks. However, because the bulk of attacks operate in the time domain, information in the frequency domain can be lost. In this study, we propose a new deep learning-based side-channel attack on time-frequency representations in order to make greater use of leakage information. To fully exploit convolutional neural networks in profiled attacks, we concurrently extract high-level key-related information from spectrograms and employ time-frequency patterns. To execute successful attacks, a functional network architecture is first set up; second, several crucial spectrogram parameters are reviewed to improve the network's training. Additionally, using available datasets in the temporal and time-frequency domains, we compare CNN-based attacks with template attacks. The experimental findings provide a fresh viewpoint and show that CNN-based attacks on spectrograms could effectively replace state-of-the-art profiled attacks.

Hassan Jameel Mutasharand, Ammar Abdulhassan Muhammed, Amjed A. Ahmed
Brain Activity During Logical Thinking: A Single Lead EEG Approach

Complex cognitive functions such as cognition through thinking, apprehension, learning, and computation are required when attempting to solve logical thinking and reasoning problems. Human logical thinking and decision making are studied by numerous academic fields in an effort to decipher their underlying neuro-functional and anatomical mechanisms. The high temporal resolution of electroencephalography (EEG) makes it a prominent brain mapping tool in contemporary neuro-engineering research for investigating the underlying brain activation dynamics during the execution of a task. In this research paradigm, an attempt was made to investigate the characteristics of the underlying brain activation power dynamics associated with correct and incorrect responses to a questionnaire based on logical problems, for the corresponding EEG frequency bands. Using Welch's power spectral density method, it is observed that the band power of the Beta band is pronounced for correct answers, whereas EEG responses for incorrect answers exhibit high Gamma band power. High band power in the Beta frequency band for correct responses correlates with the participant's aptitude for solving logical problems and indicates a conscious, vigilant mental state. As a result of a greater concentration span and the underpinning cognitive exercises in the brain, incorrect answers are anticipated to result in increased Gamma frequency band activation.

Uddipan Hazarika, Bidyut Bikash Borah, Priyanka Choudhury, Satyabrat Malla Bujar Baruah, Soumik Roy
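A minimal sketch of the Welch power-spectral-density band-power computation named above, applied to a synthetic one-channel signal rather than the study's EEG recordings; the sampling rate, band edges, and integration choice are assumptions.

```python
# Welch PSD band power on a synthetic signal (illustrative only).
import numpy as np
from scipy.signal import welch

fs = 256                                     # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(t.size)  # 20 Hz (Beta) + noise

freqs, psd = welch(sig, fs=fs, nperseg=fs * 2)

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])  # integrate PSD over the band

print("Beta  (13-30 Hz):", band_power(freqs, psd, 13, 30))
print("Gamma (30-45 Hz):", band_power(freqs, psd, 30, 45))
```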
Deep Learning-Based Productivity Analysis for Oryza sativa with Decision Support System

Among the extensive collection of crop diversity, Oryza sativa is one of the vital staples of our daily life. The plant O. sativa, better known as rice, is crucial to human society, culture, and nourishment. Its growth is intricately linked to the interaction of three important environmental elements: soil, rainfall, and temperature. These elements work together to provide the circumstances required for productive rice farming, underscoring the critical role they play in the development of this crucial crop. Farmers grow O. sativa on millions of hectares throughout the region, and many landless workers derive income from working on these farms. An ambient intelligent technology is introduced for agriculture in which deep learning is used to classify diseased leaves; a loop is then executed to check the cause of the disease and adjust the fertilizer dose accordingly. A Raspberry Pi 3 is used to monitor soil variation, the YOLO v3 algorithm is implemented to detect disease and pest movements, and the hue saturation value with a radial basis function network is used for soil variation and fertilizer dose checks. To give farmers, agronomists, and other stakeholders timely and accurate information for making informed decisions about crop management, resource allocation, risk assessment, and overall farm productivity, the system combines a variety of data sources, analytical models, and computational algorithms. We have implemented ambient intelligence and analyzed the correlation of O. sativa with soil, humidity, temperature, and pH value to understand the growth and disease of the crop in the research area. With the AgriDSS, we obtained an accuracy of 99.20% at 50 epochs with a loss variation of 0.0601%. Our research will help farmers take precautionary measures, and productivity will increase accordingly.

Nikita Soren, P. Selvi Rajendran
Improving Emotion Recognition in Audio Signals: Leveraging Novel Features and Deep Learning for Improved Classification

Recently, there has been interest in classifying emotions using audio inputs and machine learning methods. Because a single statement might be delivered in a variety of emotional circumstances, textual data alone is insufficient for identifying emotions, necessitating the adoption of novel feature extraction approaches. This paper provides an improved framework based on a detailed study of audio features such as mel-frequency cepstral coefficients (MFCCs), zero crossing rate (ZCR), spectral bandwidth, and many others, and uses them for predicting emotions. The paper also emphasises various audio augmentation techniques to improve the generalising power of the model. With these improvements to the feature vector and a carefully selected deep neural network, state-of-the-art classification accuracy is achieved on two benchmark datasets, the Toronto emotional speech set (TESS) and the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS). The paper concludes with the improved approach for emotion recognition using audio signals and points to future work on efficient speech recognition applications.

Poonam Chaudhary, Neeraj Choudhary
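A hedged sketch of extracting the audio features named above (MFCCs, ZCR, spectral bandwidth) with librosa and pooling them into one fixed-length vector per clip. The synthetic tone, feature counts, and mean pooling are illustrative assumptions, not the paper's exact feature set.

```python
# Illustrative audio feature extraction with librosa.
import numpy as np
import librosa

sr = 22050
y = librosa.tone(440, sr=sr, duration=2.0)   # stand-in for a speech clip

mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)           # (13, frames)
zcr = librosa.feature.zero_crossing_rate(y)                  # (1, frames)
bandwidth = librosa.feature.spectral_bandwidth(y=y, sr=sr)   # (1, frames)

# Pool frame-level features into a single fixed-length vector per clip
feature_vector = np.concatenate(
    [mfcc.mean(axis=1), zcr.mean(axis=1), bandwidth.mean(axis=1)]
)
print(feature_vector.shape)   # (15,)
```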
IoT-Based Health Monitoring System for Post-Covid with Diabetes

Internet of Things (IoT) devices are proliferating in the modern world, and this has a significant impact on healthcare: they can lower the price of medical care and offer early diagnosis of health issues. A patient who needs to be watched over constantly needs a healthcare monitoring system. To save their lives in dire circumstances, elderly patients, in particular, should periodically have their health state checked and reported to the doctor. Many patients suffer the negative effects of real medical issues because of a lack of adequate health monitoring, and many IoT devices are available today to monitor patient health autonomously in order to address this problem. As of now, there is no significant and adequate framework to predict the impact of Covid-19 infection in diabetes patients; the literature has shown separate ML/DL models for Covid-19 prediction and diabetes severity analysis, mainly because of the large gap in the database model. This paper presents a detailed study of machine learning (ML) models combined with IoT devices, and the proposed framework is intended for diabetic patients who were affected by Covid-19 and other symptoms, for early prediction.

P. Pankaja Lakshmi, M. Sivagami
Classification of Speech Signal Using CNN-LSTM

Speech categorisation is a broad field of study that has attracted a lot of interest recently. Understanding other people's emotions and responding appropriately is the largest distinction between robots and people. In this article, speech signals are categorised using both conventional machine learning methods and deep learning algorithms. Speech emotion is categorised using the interactive emotional dyadic motion capture (IEMOCAP) and Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) databases. The augmented dataset is mixed with the voice signals once they have been transformed into spectrograms. The model incorporates CNN + LSTM, attention, and a novel type of spectrogram frequency distribution based on the spectrogram's properties. Mel-frequency cepstral coefficients guarantee improved speech feature performance in tasks requiring emotion categorisation.

R. Gayathri, K. Sheela Sobana Rani, K. Aravindhan
A Novel Page Similarity Classification Algorithm for Healthcare Web URL Classification

Ease of access and low cost make the Internet an information hub for many users. However, it also comes with much irrelevant content and many irrelevant web pages, leading to chaos and confusion for the naïve user. Researchers have tried to develop algorithms to fetch relevant web pages from the web; however, there is scope to improve these algorithms by extracting relevant similarity-based features. Existing research utilizes standard similarity measures to find the similarity between relevant and irrelevant textual contents, which leads to lower accuracy or bias. Therefore, in this study, the authors propose a novel page similarity classification algorithm (PSCA) that classifies documents without standard similarity measures. The PSCA is based on a content similarity measure (CSM) algorithm, which finds similarities between two web URLs to classify them as true or false. In addition, the authors generate two novel features, namely a new similarity feature (NSF) and a set theory-based similarity feature (STBSF), based on algorithms from the literature, to test their performance on newly generated datasets. All three algorithms showed higher performance compared to the standard similarity measure algorithm, and the proposed PSCA showed an improved performance of 5% compared to all other algorithms.

Jatinderkumar R. Saini, Shraddha Vaidya
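For contrast with the proposed PSCA, which deliberately avoids standard similarity measures, the sketch below shows the kind of standard measure the paper argues against: TF-IDF cosine similarity between the text of two web pages. The page strings are placeholders, not data from the study.

```python
# Conventional baseline: TF-IDF cosine similarity between two page texts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

page_a = "symptoms and treatment of seasonal influenza in adults"
page_b = "influenza treatment guidelines and common symptoms"

tfidf = TfidfVectorizer().fit_transform([page_a, page_b])
score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
print(f"cosine similarity: {score:.3f}")
```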
Evaluating Cloud Security Performance for Medical Data Based on Spider Monkey Paillier Homomorphic Cryptosystem

Data sharing is made easy by cloud computing, but data security is a major problem because of online threats. For patient privacy and to guarantee data security, protecting medical data in the cloud is essential. Homomorphic encryption is a promising cryptographic method that enables computations to be conducted on encrypted information without decryption, maintaining the privacy of the information. In this study, we propose an efficient optimized spider monkey-based Paillier homomorphic cryptosystem model, referred to as SP-PHC, for cloud-based security of medical records. We gather medical datasets from the data repository and incorporate them into the envisioned SP-PHC scheme to protect against unauthorized access by third-party attackers. Sensitive medical data are encrypted using the Paillier homomorphic encryption (PHE) technique to maintain privacy. Our model not only strengthens the security of sensitive health data but also makes it easier to analyze the information in a privacy-preserving way in the cloud. Empirical findings underline the strength of our solution in terms of encryption and decryption, reaffirming its validity in protecting medical data in the cloud.

D. Kalpana, K. Ram Mohan Rao
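The Paillier cryptosystem named above is additively homomorphic: ciphertexts can be added without decrypting. The minimal sketch below demonstrates only that property using the open-source phe (python-paillier) library; the key length and toy medical readings are assumptions, and this is not the SP-PHC scheme itself.

```python
# Additive homomorphism of Paillier encryption (illustrative only).
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

heart_rate, spo2 = 72, 97              # toy medical readings
c1 = public_key.encrypt(heart_rate)
c2 = public_key.encrypt(spo2)

c_sum = c1 + c2                        # addition performed on ciphertexts
print(private_key.decrypt(c_sum))      # 169, computed without decrypting the inputs
```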
Wearable Textile Antenna Design for Body Area Network Applications at 2.45 GHz Frequency Band with Conductive Fabric Patch

A simple traditional rectangular textile patch antenna incorporating three vertical slots is designed. To enhance impedance matching, the same patch is further varied by trimming both bottom corners (stage 2), and two parasitic elements are added (stage 3) for operation in the 2.4–2.5 GHz frequency band, which is useful for many applications in the industrial, scientific, and medical (ISM) band. This variable textile patch antenna design is intended for body area network applications. To achieve long-distance communication, the radiated power of the antenna is increased, while the back radiation must stay below the specific absorption rate (SAR) limit to ensure human safety. An antenna with design improvisations operating at 2.45 GHz in the ISM band is presented. The design consists of a microstrip-line-fed wearable textile patch antenna with dimensions of 80 × 84 × 2 mm3 and a denim jeans substrate with a dielectric constant of 1.6. The practical results are validated by computing the SAR impact with two human body tissue models, for the upper arm (humerus) and the back. Finally, the simulated and measured results are compared.

A. Rajesh, R. Vani, B. Rama Rao
Handwritten Digit Recognition Using Machine Learning Classifier

Handwritten digit recognition is the process of identifying digits automatically using a computer or another platform. It holds significant promise for applications in Optical Character Recognition (OCR), biometric and signature verification, and suspect identification. This study utilizes the widely recognized MNIST handwritten digit database as the dataset and evaluates the following algorithms for digit recognition: K-Nearest Neighbors (KNN), Support Vector Machines (SVM), Backpropagation Neural Networks (BPNN), Convolutional Neural Networks (CNN), and deep learning. The research employs KNN, SVM, BPNN, and CNN with TensorFlow, and the algorithms' parameters are fine-tuned to attain optimal results for each method. This study aims to assess algorithm performance, comprehend their capabilities, and gain insights into real-world applications. Lastly, an analysis is conducted by evaluating the recognition accuracy and run time of the four algorithms, providing insights into the strengths and weaknesses of each method in the context of handwriting recognition.

Sakshi Singh, Aditi Yadav, Sonam Gupta, Pradeep Gupta
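A hedged sketch comparing two of the listed classifiers (KNN and SVM) on scikit-learn's small built-in digits dataset as a stand-in for MNIST; the study itself also evaluates BPNN and CNN with TensorFlow, which are omitted here for brevity.

```python
# KNN vs. SVM on the built-in digits dataset (illustrative stand-in for MNIST).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=3)), ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_train, y_train)
    print(name, "accuracy:", round(clf.score(X_test, y_test), 3))
```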
An Encryption Scheme and Performance Results for IoT-Based Healthcare Service Framework

Healthcare monitoring of parameters such as heart rate, oxygen saturation (SpO$$_2$$), and body temperature is used in various applications, from consumer lifestyle practices to patient monitoring. Secure data transfer from the sensor to the monitoring dashboard is essential for healthcare monitoring. In this study, an encryption scheme is proposed for a healthcare framework, and a benchmark study is conducted with C, Java, and JNI implementations on the gateway. We performed benchmarks in each application using the same data and hardware to compare the runtime speeds of the same encryption algorithm in C, Java, and JNI.

H. Hakan Kilinc
EpiAssist: Wearable Band to Predict Tonic-Clonic Seizures Using Multivariate LSTM Autoencoder

Unconsciousness is a general phenomenon during epileptic seizures and might lead to serious accidents and even death if help is not provided in time. Epileptic patients cannot work independently, as there is always a fear of a seizure that might also endanger the people around them, leading to social isolation of the patient. We propose an easy-to-use wearable band to monitor the body's vital statistics $$24\times 7$$, viz. heart rate, oxygen level, temperature, and muscle spasms, and efficiently correlate these vital stats with a normal profile recorded for the individual. The idea is to provide a cost-effective personalized prediction regime for the individual by collecting data sequentially, deploying a multivariate long short-term memory autoencoder model on AWS SageMaker with the data from the sensors, and subsequently improving the algorithm's performance with the incoming data in real time. Once a seizure is predicted, the device alerts a designated caretaker about an upcoming seizure and will also call the emergency service in the "Amber Alert" stage, which occurs when a seizure lasts longer than 3 min and could be life-threatening without immediate medical assistance. A notable feature is that the band can even work in offline mode after its training phase is over. The paper describes the working of the band and its integration with our mobile application and cloud services such as Amazon SageMaker for deploying the LSTM autoencoder model.

Anmol Sharma, Mannan Bhola, Hargobind Singh
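A minimal sketch of a multivariate LSTM autoencoder of the kind described above, written with tf.keras. The window length, the four vital-sign channels, the layer sizes, and the reconstruction-error idea are illustrative assumptions, not the deployed EpiAssist model.

```python
# Toy multivariate LSTM autoencoder for anomaly-style detection (illustrative only).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

timesteps, n_features = 60, 4                    # 60 samples x 4 vital signs per window
model = models.Sequential([
    layers.Input(shape=(timesteps, n_features)),
    layers.LSTM(32),                             # encoder: compress the window
    layers.RepeatVector(timesteps),              # repeat latent vector per timestep
    layers.LSTM(32, return_sequences=True),      # decoder
    layers.TimeDistributed(layers.Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")

X = np.random.rand(128, timesteps, n_features).astype("float32")  # stand-in sensor windows
model.fit(X, X, epochs=2, batch_size=32, verbose=0)

# A high reconstruction error on a new window would flag a possible pre-seizure pattern.
errors = np.mean((model.predict(X, verbose=0) - X) ** 2, axis=(1, 2))
print("mean reconstruction error:", errors.mean())
```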
Automatic Depression Detection Using Word Embedding and Deep Learning

Depression is a crucial problem in mental healthcare, due to its broad prevalence and grave effects on people’s lives. Individuals feel free to write about their mental state on online platforms, so a wealth of user-generated content is available on social media platforms. This research aims to identify mental health problems from social media texts. In this study, an innovative depression detection model using word embedding and deep neural network is built. The proposed study identifies people with depression with 72% accuracy and F1-score of 0.62. The experiment shows early and precise diagnosis, prompt intervention, and support for those who are at risk, all of which improve the overall well-being and mental health outcomes of people who are afflicted.

Jyoti Singh, Amita Jain
Wearable IMUs: Advancing Human Motion Analysis with Deep Learning

In recent times, there has been a notable surge in the study of human body movements using wearable inertial measurement units (IMUs). This trend stems from its substantial impact on academic and industrial circles, encompassing applications ranging from portable healthcare solutions to sports and human-computer interaction. Developing human activity recognition (HAR) systems that achieve remarkable identification accuracy while relying on just one sensor remains an ongoing technological hurdle. In this paper, we explore both supervised and partially supervised approaches using convolutional neural networks and denoising autoencoders. The goal of our study is to increase classification precision relative to previous related work and to decrease reliance on human-engineered features. Since the raw IMU data form a time series, they are split using a running window approach, and segments of roughly one second of data are fed to the neural networks. The above approaches are tested with two different variations of the original dataset, obtained with data augmentation, to balance the uneven class distribution. The dataset contains measurements of a single IMU sensor positioned on the belt of different users performing seven actions: running, jumping, walking, falling, sitting, standing, and lying. The score of the best 8-layer CNN-based system was 0.972; the worst-represented class, falling, with just 2 min of recorded data, has an F1-score of 0.919.

Satyesh Das, Divyesh Das, Ashana Parashar
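A small sketch of the running-window segmentation step mentioned above: slicing a continuous 6-channel IMU stream into roughly one-second windows that a 1D CNN could consume. The sampling rate and 50% overlap are assumptions, not the paper's settings.

```python
# Running-window segmentation of a continuous IMU stream (illustrative only).
import numpy as np

fs = 50                       # assumed IMU sampling rate (Hz)
window = fs                   # ~1 second of data per segment
step = window // 2            # 50% overlap

stream = np.random.randn(10_000, 6)   # stand-in: accel (x,y,z) + gyro (x,y,z)

segments = np.stack([
    stream[start:start + window]
    for start in range(0, len(stream) - window + 1, step)
])
print(segments.shape)   # (n_windows, 50, 6) -> input tensor for a 1D CNN
```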
PDEDX: A Comprehensive Expert System for Early Detection of Parkinson’s Disease

The timely identification of Parkinson's disease (PD) holds significant importance because there is no curative medical procedure; the disease can only be managed through medical intervention during its initial phases. However, pinpointing PD in its early stages poses a considerable challenge, given the absence of a definitive diagnostic test. A neurology specialist relies on the patient's medical background and symptoms for PD diagnosis. Since this procedure is not systematic and is based solely on human experience, it is prone to human error. To address these issues, the Parkinson's Disease Early Diagnosis with Extra-Trees (PDEDX) expert system is proposed in this research for the early diagnosis of Parkinson's disease. The proposed expert system applies the SMOTE oversampling technique to address class imbalance, Boruta feature selection to perform feature selection, and a bagging ensemble of Extra-Trees (ET) for classification to overcome the variance and overfitting problems of single-classifier-based models. The suggested model, PDEDX, demonstrates notably superior performance in terms of accuracy and F1-score when compared to a range of single-classifier-based models, ensemble models, and other models documented in the literature.

Saptarsi Sanyal, Shanmugarathinam, Naveen Vijayakumar Watson
An Analytical Perspective of Machine Learning Predictive Models to Diagnose Chronic Diseases

Machine learning techniques have significantly impacted healthcare by enabling early detection of chronic diseases, leading to slightly reduced global mortality rates. However, the complexity of models and vast datasets requires careful selection and utilization of appropriate predictive models for specific diseases, leading to enhanced outcomes in healthcare systems. This article discusses recent research findings on the performance of various predictive models and their practical applications, providing insights for further advancements in healthcare.

Rattan Pal Singh Rana, Sudhanshu Gupta, Umesh Gupta
Analyzing the Performance of Photovoltaic Modules Using PVsyst Software Under Realistic Operating Conditions in Iraq

Photovoltaic (PV) modules are typically characterized by their I–V and P–V curves at fixed temperatures and varying irradiances. The module temperature, influenced by irradiance, is essential in simulating real-world conditions for PV modules. This study uses PVsyst software to evaluate the performance of polycrystalline PV modules under the specific conditions of Iraq, such as dust and high temperature. Analysis of the module's behavior under actual conditions showed that the power output of the PV module was reduced by 14.1% compared to the control under Standard Test Conditions (STCs) due to the temperature rise caused by increased solar irradiation. When solar irradiance dropped to 200 W/m2, this reduction escalated to 82.3%. Moreover, the rise in temperature adversely affected the module's efficiency, with efficiency decreasing linearly with temperature. This research provides valuable insights into the performance of polycrystalline PV modules and the impact of temperature variations on their power output and efficiency, contributing to the optimization of PV systems in Iraq. The results of this study help in designing future solar energy systems that account for temperature losses and provide additional energy output capacity to compensate for them.

Amer Saad Abbas, Ali Nasser Hussain, Abdulrahman Th. Mohammad
Energy Harvesting with Network Coding Technique of Radio Over Fiber: Toward Minimizing of Outage Probability

Energy harvesting allows sensors in remote areas to harvest energy without a physical connection. In this work, and for the first time, the performance of radio over fiber (RoF) with energy harvesting (EH) in a cooperative communication technique is analyzed. The impact of small-scale and large-scale fading is mathematically modeled. We present the outage probability as the performance metric for RoF with EH in the cooperative communication technique. Finally, the results show that cooperative communication improves the system over traditional direct communication with energy harvesting.

Shakir Salman Ahmad, Hamed Al-Raweshidy
Diagnosing the Early Stages of Alzheimer’s Disease by Applying the Modified Ant Colony Optimization Technique

Treating Alzheimer's Disease (AD) and preventing further degeneration are becoming increasingly important. Doctors who could view many morphological aspects would be able to evaluate patients more extensively for improved clinical practice. Previous studies have shown the value of applying deep learning to T1-weighted MRI images to differentiate AD from normal controls. This paper proposes three binary classification tasks using a 3D CNN network, and a Modified Ant Colony Optimization (MACO) technique is proposed for optimizing the weights of the network. The effectiveness of the proposed model is assessed using a sample of 259 ADNI individuals.

Rashmi Kumari, Subhranil Das, Raghwendra Kishore Singh
Bi-objective Enhanced Index Tracking: Performance Analysis of Meta-heuristic Algorithms with Real-World Constraints

The enhanced index tracking problem (EITP) aims to add sustainable value to portfolio management by emulating the behavior of a benchmark index while limiting the number of assets held. There have been numerous approaches to the EIT problem; despite this, producing a high-quality portfolio continues to be a challenge. We examine bi-objective enhanced index tracking, which considers both the anticipated excess return of a portfolio relative to the benchmark and the degree of deviation from the benchmark, known as the tracking error. In this approach, tracking error and excess return are used as the optimization objectives. The objective of this study is to evaluate the relative efficacy of prominent meta-heuristic evolutionary algorithms, NSGA-II, SPEA2, MOEA/D, and MOPSO, in addressing enhanced index tracking problems with real-world constraints such as a budget constraint, a bound constraint, and a cardinality constraint. Due to these constraints, the problem becomes nondeterministic polynomial-time (NP) hard. The paper examines the computational results of real-world benchmark instances; the proposed methodology has been applied to five data sets derived from significant global markets. Preliminary empirical results show that NSGA-II outperforms the other multiobjective evolutionary algorithms.

Ibtesaam Rais, Shahzad Alam, Chanchal Kumar, Suraj S. Meghwani
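A worked sketch of the two objectives named above, excess return and tracking error, computed on made-up periodic return series; the numbers and the sample-standard-deviation definition of tracking error are assumptions, not data from the study.

```python
# Excess return and tracking error of a portfolio versus its benchmark (toy data).
import numpy as np

portfolio = np.array([0.012, -0.004, 0.008, 0.010, -0.002])   # periodic returns
benchmark = np.array([0.010, -0.006, 0.007, 0.012, -0.001])

active = portfolio - benchmark
excess_return = active.mean()            # average active return (to maximise)
tracking_error = active.std(ddof=1)      # volatility of active returns (to minimise)

print(f"excess return:  {excess_return:.4f}")
print(f"tracking error: {tracking_error:.4f}")
```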
An EETR Approach for Therapeutic Response Prediction Using Gene Expression and Drug Properties

In recent years, the role of computational methods such as machine learning and deep learning has evolved to help better understand an individual's response to drugs. Through advancements in the discipline of precision medicine, cancer therapies are designed based on an individual's pharmacogenomics data. In this way, doctors can make informed decisions about patient outcomes and reduce treatment costs. Beyond these advancements, there is still much room to precisely predict drug response. A drug's therapeutic response depends on a number of factors, such as the drug's physicochemical properties, pharmacokinetics, metabolism, protein, mutation, somatic variation, environment, and much more. Studying the correlation of more of these factors and their association with the drug is one of the promising ways to bridge the gap and improve results. Extra tree regression is an ensemble-based algorithm that is less prone to overfitting than other ensemble learning algorithms, particularly when low bias and computational efficiency are desired. In this paper, we apply the drug's physicochemical properties to predict inhibition values using an enhanced extra tree regression (EETR) algorithm. The EETR algorithm outperformed both the random forest and the original ETR algorithm in terms of both MSE and $$R^{2}$$ score.

P. Selvi Rajendran, Janiel Jawahar
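A hedged sketch of the baseline extra-trees regression step (the paper's EETR is an enhanced variant not reproduced here): predicting a continuous inhibition-style response from numeric drug descriptors and scoring with MSE and $$R^{2}$$. The data are synthetic and the hyper-parameters are assumptions.

```python
# Baseline extra-trees regression on synthetic descriptor data (illustrative only).
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

X, y = make_regression(n_samples=500, n_features=20, noise=5.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

model = ExtraTreesRegressor(n_estimators=200, random_state=1).fit(X_train, y_train)
pred = model.predict(X_test)
print("MSE:", round(mean_squared_error(y_test, pred), 2),
      "R2:", round(r2_score(y_test, pred), 3))
```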
Enhancing Few-Shot Learning with Optimized SVM-Based DeepBDC Models

Few-shot learning is a challenging task in computer vision, where the goal is to classify objects or entities based on very limited training examples. In this paper, we propose a novel approach that combines deep Brownian distance covariance (DeepBDC), few-shot classification with cosine similarity (FS-CS), and support vector machines (SVM) to improve few-shot learning accuracy. We present extensive empirical results on the CUB-200 dataset, demonstrating the effectiveness of our approach. In our experiments, we investigate the impact of different similarity functions, including Euclidean distance, cosine similarity, and inner multiplication, on few-shot classification accuracy. Our results show that the SVM-based similarity function outperforms other methods, achieving the highest accuracy in both 1-shot and 5-shot settings. These findings suggest that SVM-based similarity functions can significantly enhance few-shot learning. Furthermore, we compare our approach to existing models and highlight its advantages in terms of accuracy and efficiency. Our research contributes to the field of few-shot learning by introducing a novel combination of techniques and innovative methodologies, which can pave the way for further advancements in this area.

Mohammad Reza Mohammadi, Jaafar M. Al-Ghabban, Mohammad S. AlMusawi
Motion Planning of the Autonomous Vehicles with Multi-view Images and GRUs

Motion planning is arguably the most significant module in autonomous driving. Image-based and LiDAR-based representations are very successful but costly. The proposed method considers three images from different angles to understand the ego vehicle's surroundings and decreases the occurrence of infractions such as traffic signal violations. The proposed model generates a sequence of coordinates representing the waypoints in the predicted path of the vehicle for the upcoming few time steps and uses an inverse dynamics algorithm to derive the driving parameters, such as steering angle, throttle, and brake value, from the coordinates of the waypoints generated by the model. The CARLA simulator is used for data extraction to train and evaluate the model and to simulate the motion planning. The proposed approach shows an improvement in the Route Completion and Driving Score metrics compared to existing methods on the dataset of this simulator.

Divya Singh, Rajeev Srivastava, Umesh Gupta
Gamification for Achieving Sustainability: Trends and Future Scope

This study presents current trends and the scope of gamification in achieving sustainability. We conduct a bibliometric analysis of the extant literature with the SPAR-4-SLR method, analyzing it by year, author, citation, country, source, affiliation, sponsoring institution, and keyword. The study concludes that gamification is employed in different business processes and in domains like health, business, tourism, and education to achieve sustainability. Gamification tools like augmented reality, virtual reality, simulation, and serious games are the prominent games analyzed in the literature. Hence, achieving SDGs through gamification is an interesting arena to be explored in further studies.

Swati Sharma
Decoding Twitter Spam: Exploring Modern Detection Methods and Future Prospects

Twitter serves as a popular platform for sharing and connecting, but it is also a target for individuals who disseminate unwanted content, commonly known as spam. Recently, researchers have come up with different ways to find and stop this kind of spam on Twitter. This study explores multiple research works focusing on detecting spam on Twitter; some relied on traditional machine learning methods, while others experimented with newer deep learning techniques. We carried out a careful review around four groups of research questions and chose 48 studies out of 1080 articles from important journals and conferences. The chosen 48 studies were selected based on the range of spam detection techniques they cover, with a focus on machine learning methods such as classification and clustering. We divided all the studies into five groups related to finding Twitter spam: publication years, evaluation parameters, evaluation tools, open issues and challenges, and covered years, and we examined how the studies are distributed across these dimensions. We also discuss the problems with finding spam on Twitter, including how spammers keep changing tactics, the lack of enough good examples to learn from, and the difficulty of finding spam in different languages. We aim to spread awareness about identifying and preventing spam across various domains and suggest new avenues for further research. Our findings give a big picture of what is going on in finding Twitter spam and are helpful for experts and researchers who work in this area.

Satinder Pal, Anil Kumar Lamba
Quantum-Secured Collaborative Machine Learning: Facilitating Privacy-Protecting Quantum Federated Learning

Federated learning (FL) is an exciting new method for securing data privacy in distributed machine learning environments. Quantum computing conditions make it far more difficult to ensure strong privacy protection in FL. Quantum federated learning (QFL) approaches are the focus of this research in order to promote private, confidential, and effective group work. It is proposed that federated learning processes incorporate Quantum Secure Multi-Party Computation (SMPC) protocols. The benefits of QFL in terms of privacy protection and model performance have been empirically demonstrated. QFL’s computational efficiency, scalability, and security are highlighted in thorough performance studies in comparison to conventional FL approaches. The findings highlight the potential for quantum processing to improve the privacy and security of FL, paving the way for decentralized, privacy-preserving quantum-secured collaborative machine learning. In conclusion, empirical evidence supports the efficacy and scalability of QFL, making it a promising option for tackling the difficulties of privacy and security in collaborative machine learning.

S. Ravikumar, E. Chandralekha, K. Vijay, K. Antony Kumar, C. Pretty Diana Cyril
Kinematic Gait Analysis Using Markerless System to Determine Joint Angles

Kinematic gait analysis is the process of determining the lower extremity joint angles to assess the quality of human movement. In several disciplines, including sports biomechanics, rehabilitation, and ergonomics, precise measurement and analysis of joint angles are essential. Traditional marker-based systems have drawbacks, such as difficult setup and constrained mobility. Markerless systems are a viable substitute, offering greater usability and flexibility. This paper proposes joint angle determination using 2D BlazePose, whose topology is a superset of the COCO keypoints, BlazePalm, and BlazeFace, and which can run on resource-constrained computing devices such as smartphones. Marker-based gait analysis is expensive and not easily accessible; keeping this in mind, the developed system can be used in daily practice by doctors to assess their patients. We evaluated our system on a dataset of videos with accompanying motion capture data. The mean absolute error (MAE) between our system and the motion capture (MoCap) system on this dataset for the left and right leg hip, knee, and ankle angles is 3.6°, 7.8°, 9.1°, 3.8°, 9.6°, and 7.8°, respectively.
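As a minimal illustration of the joint-angle computation described above, assuming 2D keypoints are already available (e.g. from MediaPipe's BlazePose); the pixel coordinates below are hypothetical.

```python
# Minimal sketch: angle at a joint from three 2D keypoints (hip-knee-ankle).
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by the segments b->a and b->c."""
    a, b, c = map(np.asarray, (a, b, c))
    v1, v2 = a - b, c - b
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical 2D pixel coordinates for the hip, knee, and ankle of one leg
hip, knee, ankle = (320, 240), (330, 330), (325, 420)
print(f"knee flexion angle: {joint_angle(hip, knee, ankle):.1f} deg")
```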

Mohd Irfan, Nagender Kumar Suryadevara, Rakesh Biswas, Anuroop Gaddam
Auction-Based Optimization with Machine Learning for Cyber Threat Detection and Classification Model

Cyber threat detection and cybersecurity more broadly can benefit greatly from advances in artificial intelligence (AI). Cyber threat detection is a core concern of cybersecurity and uses various tools and techniques to identify unauthorized access, attacks, and security incidents across systems, applications, and networks. Owing to the volume of data flowing over networks and the large number of connected devices, the likelihood of intrusions and cyberattacks has increased. Monitoring this immense volume of traffic is difficult, although machine learning (ML) techniques effectively support the task. This study develops an auction-based optimization with machine learning for cyber threat detection and classification (ABOML-CTDC) technique. The presented ABOML-CTDC technique concentrates on the identification and classification of cyber threats in accomplishing cybersecurity. The proposed ABOML-CTDC technique makes use of the ABO-based feature subset selection (ABO-FSS) technique to elect optimal features. For cyber threat detection, the ABOML-CTDC technique utilizes a variational autoencoder (VAE) model. Finally, the moth flame optimization (MFO) algorithm is exploited for the parameter selection of the VAE model to improve the detection rate. The experimental validation of the ABOML-CTDC approach is tested on benchmark datasets, and the results show promising threat detection performance compared with recent state-of-the-art methods.

Hamed Alqahtani
Premature Ventricular Contraction Recognition Using Support Vector Machine (SVM) Based on Wireless Communication Protocols with Medical Sensor ECG

The signal under consideration is a biologically significant indicator utilized to diagnose heart disorders, as it exhibits the cyclic contraction and relaxation patterns of the myocardial muscles in the human heart. This noninvasive tool is crucial in identifying life-threatening heart conditions, such as abnormal electrocardiogram (ECG) heartbeats and arrhythmias, which can lead to death. Premature ventricular contraction (PVC), one of the most prevalent arrhythmias, arises from the ventricular region of the heart and has the potential to induce palpitations, cardiac arrest, and various other symptoms that may impede a patient’s daily functioning. To reduce doctors’ workload in assessing heart arrhythmia and disease, computer-assisted techniques are now used to diagnose them automatically. This study proposes a machine learning (ML) methodology for the identification of PVCs using the MIT-BIH arrhythmia database. The feature extraction method entails the computation of ten distinct features related to heartbeats. Three morphological aspects, including R wave amplitude, QRS complex duration, and QRS complex shape, were examined. Additionally, seven statistical features were calculated for each signal. These features were derived from 8 s of ECG data, resulting in a feature vector. A support vector machine (SVM) model was utilized to analyze these features, identify distinct patterns, and improve classification accuracy. The outcomes highlight a substantial enhancement in diagnostic performance.
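A hedged sketch of the general feature-vector-plus-SVM workflow follows; the features computed here are simple stand-ins, not the paper's exact ten features, and the ECG segments and labels are synthetic placeholders.

```python
# Sketch: crude morphological/statistical features from ECG segments + an SVM classifier.
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def beat_features(segment):
    """A few simple descriptors of one ECG segment (illustrative only)."""
    seg = np.asarray(segment, dtype=float)
    return np.array([
        seg.max(),                    # stand-in for R-wave amplitude
        np.argmax(seg) / len(seg),    # relative R-peak position
        seg.mean(), seg.std(),
        skew(seg), kurtosis(seg),
        np.sqrt(np.mean(seg ** 2)),   # RMS energy
    ])

rng = np.random.default_rng(0)
segments = rng.normal(size=(200, 2880))   # 200 dummy 8 s segments at 360 Hz
labels = rng.integers(0, 2, size=200)     # dummy labels: 0 = normal, 1 = PVC

X = np.array([beat_features(s) for s in segments])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```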

Ahmed Yassin Ali, Ziad Saeed Mohammed
Enhancing the Prediction of Customers’ Satisfaction with Airline Companies Using Data Mining and Genetic Techniques

Companies compete with each other to provide the best services that could satisfy their customers. Users’ satisfaction is a key aspect that companies aim to achieve, as it determines the success or failure of a company. However, it is necessary to understand what leads to customers’ satisfaction with a particular company, technology, or product. This research aims at (1) predicting customers’ satisfaction with airline companies, (2) identifying the most influential features for airline customers’ satisfaction, and (3) enhancing the prediction accuracy using several different techniques. A dataset of 129,880 customers is used in this research; it includes demographic features and customers’ perceptions. Unlike previous literature, this research applies several steps to enhance the prediction accuracy of the implemented techniques: handling missing values, dealing with outliers, generating new features, applying feature selection techniques, and integrating a genetic optimizer. Two data mining techniques are applied to predict customers’ satisfaction: the random forest classifier and the K-nearest neighbor classifier. The results show that the random forest classifier, with handling of missing values and outliers, normalization, integration of the newly generated features, and application of a feature selection technique and a genetic optimizer, outperformed the K-nearest neighbor classifier with an accuracy of 95%. Regarding the newly generated features, service quality and engagement are significant predictors of customers’ satisfaction, whereas information quality was not a determinant feature. The research outcomes can help airline companies improve their services and respond to customers’ needs.
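The preprocessing and classification pipeline described above can be sketched roughly as follows; the CSV file name and column names are assumptions, and a plain random forest with embedded feature selection stands in for the genetically optimized model.

```python
# Hedged sketch: imputation + feature selection + random forest for satisfaction prediction.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("airline_satisfaction.csv")            # hypothetical file name
X = pd.get_dummies(df.drop(columns=["satisfaction"]))   # one-hot encode categoricals
y = df["satisfaction"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = make_pipeline(
    SimpleImputer(strategy="median"),                                  # handle missing values
    SelectFromModel(RandomForestClassifier(n_estimators=100,
                                           random_state=42)),          # feature selection
    RandomForestClassifier(n_estimators=300, random_state=42),         # final classifier
)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```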

Shahad Hussein Ewadh, Ahmed Al-Azawei
Exploring Predictive Models Utilizing Machine Learning and Deep Learning Techniques for Diabetes Mellitus: A Comprehensive Literature Review

Machine learning (ML) and deep learning (DL) approaches have recently attracted considerable attention in diabetes mellitus research because of their potential for predicting and understanding the nuanced facets of the condition. This comprehensive literature review analyzes the present landscape of predictive modeling of diabetes mellitus, focusing on innovative ML and DL methods and on the persistent problems faced by scientists and engineers working on type 2 diabetes. To systematically summarize the results of 18 carefully selected papers, this study applied the PRISMA method, supplemented with methods from Keele and Durham universities. Several ML and DL methods, covering 18 distinct model types, were thoroughly evaluated. Predicting diabetes using tree-based algorithms is highly accurate; in contrast, deep neural networks have shown subpar performance, despite their inherent capacity to handle complicated and large-scale datasets. The study also emphasizes the importance of feature selection and data balancing as key data preparation methodologies for enhancing model efficiency, and it has been demonstrated that models trained on structured datasets achieve unprecedented prediction accuracy. The review examines the benefits and drawbacks of numerous ML and DL approaches, with the aim of providing researchers and medical professionals with in-depth knowledge of the current state and possible future directions of diabetes mellitus predictive modeling.

Lena abed ALraheim Hamza, Hussein Attya Lafta, Sura Z. Al Rashid
Impact of Value-Added Courses and Personalized Method of Delivery in Achieving Industry Readiness: An Industry 4.0 Perspective

Technology has grown rapidly over the past few decades, in parallel with massive population growth. Higher education plays a significant role in driving performance and competitiveness in national and international economies in line with such growth. Graduates from the majority of higher education institutes often possess skillsets insufficient to meet the needs of industry, which forces the use of additional resources to achieve industry readiness and enable them to work on real-time projects. The present study emphasizes the need for value-added courses to bridge this industry-academia gap. The study is conducted on engineering students, and hence the role of value-added courses on Industry 4.0 and related technologies in bridging the gap is explored. At the outset, a need analysis survey is conducted on engineering undergraduate students to validate their understanding of the existing industry-academia gap, the need for a short-term value-added course, and specifically the need for Industry 4.0-based courses to bridge the gap and contribute toward industry readiness. Considering the results of the need analysis survey, as well as the unique strengths and weaknesses of each student, an Industry 4.0-based curriculum is developed in coordination with academicians and industry experts. Further, QAA training was provided to the academic experts, which enabled the contributors to adopt non-traditional patterns in the design, delivery, and assessment of the course. Based on the input of the academic and industry experts, the value-added course was designed, developed, and approved by board of studies members drawn from both verticals. The course was delivered to 56 students by faculty members from Vellore Institute of Technology, India, and Nottingham Trent University, UK, along with reputed industry experts who shared their real-time perceptions and knowledge of Industry 4.0 and enabling technologies such as augmented reality, virtual reality, and digital twins. The students were assessed through traditional quizzes and digital assignments that evaluated their critical thinking and real-time problem-solving skills. The outcome assessment of the course yielded positive results, with the students successfully attaining the outcomes. The students also participated in a post-completion survey emphasizing soft-skill development, which revealed improvements in their ability to focus on the main idea of a project, to speak and put forward opinions when identifying problems and solutions, and to organize ideas when writing an executive summary and drawing conclusions from the findings. Of the 56 students who completed the course, 8 achieved placements as interns in reputed companies, which further justifies the attainment of the course objective of bridging the industry-academia gap and ensuring industry readiness.

Pratik Vyas, Sweta Bhattacharya, Y. Supriya, Dasari Bhulakshmi, Thippa Reddy Gadekallu, Rajesh Kaluri, S. Sumathy, Srinivas Koppu
Machine Learning Approaches for Improving the Accuracy of Blood Cell Detection and Subtypes Classification Using Smear Microscopic Images

Human blood contains red blood cells (RBCs), white blood cells (WBCs), platelets, and plasma, and the complete blood cell count reflects the state of health. A normal human has an RBC count ranging from 4.5 to 6.0 million cells per microliter in males and 4.0 to 5.0 million cells per microliter in females, and a WBC count ranging from 4.5 to 11.0 thousand cells per microliter in both males and females. The segmentation and identification of blood cells are therefore extremely important. RBC and WBC counts are essential for diagnosing varied diseases such as haemolytic anaemia, nutritional anaemias, acute myeloid leukaemia, chronic myelogenous leukaemia, and chronic lymphocytic leukaemia. Blood cell counting is performed manually in hospital laboratories using a device known as a hemocytometer together with a microscope. However, this technique is tedious, laborious, and time-consuming, and it can produce incorrect results due to human error. There are also automated counting instruments, but these are expensive and not affordable for every laboratory. The proposed method uses image processing to classify blood cells with the help of ResNet deep neural networks. The algorithm extracts features from each segmented cell image and classifies its type. The overall accuracy was 93.01%. The system has been developed to provide accurate and fast results using a large dataset of blood smear images.
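A minimal transfer-learning sketch along these lines follows, assuming an ImageFolder-style dataset of smear images; the dataset path and class layout are hypothetical, and ResNet-18 stands in for the paper's unspecified ResNet variant.

```python
# Hedged sketch: fine-tuning a torchvision ResNet for blood cell subtype classification.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("blood_smears/train", transform=tfm)  # hypothetical path
loader = DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)       # pretrained backbone
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))      # new classification head

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, targets in loader:            # one illustrative training epoch
    opt.zero_grad()
    loss = loss_fn(model(images), targets)
    loss.backward()
    opt.step()
```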

S. Pravinth Raja, Sameeruddin Khan, Shaleen Bhatnagar, Thomas M. Chen, Mithileysh Sathiyanarayanan
Enhancing End-to-End Data Security in Cloud Environment Using Data Compression, Encryption, and Data Hiding Techniques

As the use of smart devices increases, the cloud environment is also expanding rapidly. Data is crucial to users today, yet the security of user data in the cloud environment often falls short of expectations. We therefore need to strengthen data security in the cloud; to do so, we use techniques that reduce the size of the data and provide end-to-end security. We use Huffman encoding, which reduces the data size in a lossless manner, and a binary stream processor, which reduces it further. After reducing the size of the data, we encrypt it and also compute a digital signature to maintain its integrity. The encrypted data is then embedded inside an image using a steganography technique. Each step enhances the security of the data and also reduces its size, making the whole system more efficient.
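The compress-encrypt-sign portion of such a pipeline can be sketched as follows; zlib's DEFLATE (which uses Huffman coding internally) stands in for the paper's Huffman encoder, the key handling is illustrative only, and the final steganographic embedding step is omitted for brevity.

```python
# Hedged sketch: lossless compression, symmetric encryption, and an integrity tag.
import hashlib
import hmac
import os
import zlib

from cryptography.fernet import Fernet

def protect(data: bytes, enc_key: bytes, mac_key: bytes):
    compressed = zlib.compress(data, level=9)          # lossless size reduction
    ciphertext = Fernet(enc_key).encrypt(compressed)   # symmetric encryption
    signature = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()  # integrity tag
    return ciphertext, signature

def recover(ciphertext: bytes, signature: bytes, enc_key: bytes, mac_key: bytes) -> bytes:
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        raise ValueError("integrity check failed")
    return zlib.decompress(Fernet(enc_key).decrypt(ciphertext))

enc_key, mac_key = Fernet.generate_key(), os.urandom(32)
record = b"patient record ..." * 50
ct, sig = protect(record, enc_key, mac_key)
assert recover(ct, sig, enc_key, mac_key) == record
```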

Parvinder Singh, Dushyant, Pardeep Kumar
BotNet Attack Detection Using MALO-Based XGBoost Model in IoT Environment

The Internet of Things (IoT) has flourished and found numerous practical applications thanks to the advent of energy-aware sensor devices as well as autonomous and intelligent systems. However, IoT devices are especially vulnerable to botnet attacks. To counteract this risk, a lightweight anomaly-based detection method can be designed that profiles both malicious and benign behavior on IoT networks. Machine learning (ML) procedures can also be applied to the mountain of data produced by IoT devices. Various tactics have been used to identify a botnet's initial point of entry; however, the task remains challenging due to the limited number of characteristics included in botnet datasets. In this study, nine industrial-grade IoT devices were compromised, and XGBoost models were constructed to identify and categorize common IoT botnet threats, including Mirai and BASHLITE. Rigorous hyperparameter tuning using a variant of the ant lion optimization algorithm (ALOA) formed the basis of the model development process.
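A hedged sketch of such a detection model follows, with a random hyperparameter search standing in for the ant lion optimization variant and synthetic placeholder data in place of real botnet traffic features.

```python
# Illustrative sketch: XGBoost classifier for botnet traffic with generic hyperparameter tuning.
import numpy as np
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))        # placeholder traffic features
y = rng.integers(0, 3, size=1000)      # e.g. benign / Mirai / BASHLITE (dummy labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
search = RandomizedSearchCV(
    XGBClassifier(objective="multi:softprob", eval_metric="mlogloss"),
    param_distributions={
        "max_depth": [3, 5, 7],
        "learning_rate": [0.03, 0.1, 0.3],
        "n_estimators": [100, 300],
    },
    n_iter=6, cv=3, random_state=0,
)
search.fit(X_tr, y_tr)
print("test accuracy:", search.score(X_te, y_te))
```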

Omar A. Alzubi
Hybrid Optimization-Based Support Vector Machine for Detecting the Network Attacks in IoT

Internet of Things boundary attacks must be monitored in near real-time to ensure the safety and security of critical infrastructure. In this research, we present an intelligent intrusion detection scheme designed to identify attacks originating from the Internet of Things. In particular, a machine learning technique called the support vector machine (SVM) is utilized to detect malicious IoT network traffic. The features are derived using a 1D-CNN (one-dimensional convolutional neural network) model. The hybrid whale dragonfly optimization method (H-WDFOA) is used to determine the best values for the SVM kernel weights. The detection solution guarantees safe operations and facilitates the interoperability of IoT connectivity protocols. When it comes to protecting a network, one of the most common forms of security equipment is the intrusion detection system (IDS). In addition, 5G networks are under pressure to gather, process, and analyze enormous volumes of data traffic and network connections in order to meet the rising demand for user-centric cybersecurity solutions.

Jafar A. Alzubi
Enhanced MobileNetV3: A Deep Learning Approach for Accurate Parking Lot Occupancy Detection

We present an improved and optimized MobileNetV3 model for accurately determining parking lot occupancy. The CNRPark-EXT and PKLot datasets are used to train our model, which incorporates modifications to enhance its performance. Using a real-time video feed, our model classifies individual parking space patches as occupied or empty. The modifications include using a different activation function, employing a convolutional block attention module instead of the squeeze-and-excitation module, and utilizing blueprint separable convolutions. Comparative experiments with CarNet and mAlexNet demonstrate that our model achieves an average accuracy of 98.01%, surpassing previous state-of-the-art solutions. Our findings suggest the potential of the proposed model for real-time applications.
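As an illustration of one of the listed modifications, the sketch below shows an unconstrained blueprint separable convolution (a pointwise convolution followed by a depthwise convolution); it is an assumption-labeled example, not the authors' exact layer, and the Hardswish activation is simply a MobileNetV3-style choice.

```python
# Illustrative sketch: blueprint separable convolution (BSConv-U) block in PyTorch.
import torch
from torch import nn

class BSConvU(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.depthwise = nn.Conv2d(
            out_ch, out_ch, kernel_size, stride=stride,
            padding=kernel_size // 2, groups=out_ch, bias=False,  # one filter per channel
        )
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.Hardswish()   # MobileNetV3-style activation

    def forward(self, x):
        return self.act(self.bn(self.depthwise(self.pointwise(x))))

x = torch.randn(1, 16, 64, 64)        # dummy parking-patch feature map
print(BSConvU(16, 32)(x).shape)       # torch.Size([1, 32, 64, 64])
```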

Yusufbek Yuldashev, Sadriddin Khudoikulov, Abrorjon Kucharov, Rashid Nasimov, Akmalbek Abdusalomov
Personality Traits of Online Medication Shoppers

Using a large representative sample of the Swedish population, the present study aimed to explore the relationship between online shopping of medicines and personality traits (i.e. degree of Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism). In total, 1622 persons responded to a questionnaire, including measures of online shopping of medicines and personality traits. Our findings indicate that online shopping of medicines is associated with a high degree of Openness and Neuroticism. The study relies on two different personality inventories that revealed the same result.

John Magnus Roos, Pernilla Bjerkeli
Deep Reinforcement Learning-Based Energy-Efficient Aggregation Model for Wireless Sensor Network

The proposed work discusses aggregation methods for deep reinforcement learning (DRL)-based wireless sensor networks. The aggregation algorithm plays a vital role in reducing overall energy consumption: it aggregates data samples, removes redundant data, and reduces the overall number of packets, thereby reducing overall energy consumption. We begin with an introduction, followed by a discussion of routing protocols with aggregation for MWSNs, which presents an overview of different routing protocols with aggregation. The performance of Optimized Link State Routing (OLSR) without aggregation is compared with the OLSR protocol with aggregation. Subsequently, SOM-based routing protocols for WSNs are elaborated. SOM-OLSR with aggregation is reviewed and compared with the SOM-based routing protocol without aggregation for MWSNs. Similarly, DRL with aggregation is reviewed and compared with the DRL-based routing protocol without aggregation for MWSNs. The results show that DRL-OLSR with aggregation saves 50% of the energy relative to the variant without aggregation, and that the average energy saving exceeds 30% for a high-density network. Thus, we can conclude that the proposed DRL-based aggregation method performs much better than other existing methods.
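The redundancy-removal idea behind such aggregation can be illustrated with a toy sketch (not the DRL-OLSR protocol itself): readings that differ from the last forwarded value by less than a tolerance are suppressed, so fewer packets are transmitted and radio energy is saved.

```python
# Toy sketch of redundancy-removing aggregation on a sensor node.
def aggregate(readings, tol=0.5):
    """Keep a reading only if it differs from the last kept one by more than tol."""
    kept = []
    for r in readings:
        if not kept or abs(r - kept[-1]) > tol:
            kept.append(r)
    return kept

samples = [20.1, 20.2, 20.1, 23.5, 23.6, 23.5, 19.0]
print(aggregate(samples))   # [20.1, 23.5, 19.0] -> far fewer packets to forward
```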

Nada Alsalmi, Keivan Navaie, Hossein Rahmani
NeuroFlex: An IoT-Integrated Glove for Personalized Post-Stroke Rehabilitation with an Explainable Machine Learning-Based Monitoring System

Post-stroke rehabilitation is pivotal for individuals aiming to regain lost functionalities and improve their quality of life. The integration of innovative technology can enhance this recovery journey. “NeuroFlex”, an IoT-integrated glove, is introduced to aid post-stroke individuals. It has sensors to capture real-time data on hand movements and strength, which is sent to a centralised server. This server, equipped with machine learning algorithms, tailors rehabilitation exercises based on the patient's progress. Such personalization ensures an optimised recovery pathway tailored to each patient's needs. Furthermore, NeuroFlex transmits this data for processing through an explainable machine learning-based monitoring system. This AI framework not only tracks patient progress but also offers insights into its decision-making, enhancing transparency and trust in the rehabilitation process. This offers insights into potential issues, paving the way for timely interventions that can optimise outcomes and avert possible setbacks. Early trials suggest that NeuroFlex surpasses traditional methods in terms of rehabilitation speed and user satisfaction. It represents the fusion of wearable tech, IoT, and machine learning, revolutionising personalised post-stroke care. Additionally, its remote controllability offers opportunities for teleconsultations with therapists, enhancing patient accessibility and convenience.

Krishnaveni Sivamohan, Sivamohan Sivanandam, S. Nagarani, Thomas M. Chen, Mithileysh Sathiyanarayanan
Empirical Validation of Software Defect Prediction Models Using Grid Search Gray Wolf Optimization

Software engineering principles deal with the quality of software in consideration of various attributes. Software defect prediction is one of the key domains in the field of software engineering; it aims to deliver non-defective software to the end user. In order to ensure the reliability and robustness of software, defects must be identified at each stage by conducting various review activities, and various defect prediction models have been developed to analyze software quality. In this study, a defect prediction model is built using Traditional Gray Wolf Optimization (TGWO) and Grid Search GWO (GSGWO). The performance of the defect prediction models is analyzed on the basis of Within-Project Defect Prediction (WPDP_GWO), tenfold_GWO, TGWO, and GSGWO. Further, a homogeneous cross-project defect prediction model is also developed under the consideration of predictive modeling. The performance of the prediction models is analyzed using the AUC performance metric, and statistical tests are used for their validation. Overall, RFGWO performed best for CPDP and GSGWO performed best for WPDP.

Ruchika Malhotra, Shweta Meena
An Effective Deep Q-Learning Architecture for Class Mapping in Multi-domain Software-Defined Internet of Things

There are specific communication requirements for each Internet of Things (IoT) application, such as jitter, packet delivery ratio, and latency. In heterogeneous networks, an end-to-end (E2E) route may pass across several domains with different quality-of-service (QoS) traffic classes in each domain, making it difficult to meet these unique E2E QoS criteria along the E2E path of IoT networks. This paper provides a hierarchical software-defined networking (SDN) architecture employing deep Q-learning and a multi-criteria decision-making (MCDM) scheme to determine the appropriate QoS class for the E2E route in the SD-IoT heterogeneous network and to perform its mapping for E2E service requests. The architecture is composed of two layers; in the global controller, we map the appropriate service classes for supplying E2E QoS in accordance with application service requests. Our proposed framework is demonstrated on different real Internet topologies.

Jehad Ali, Jisi Chandroth, Byeong-hee Roh
AI-Based Optimization Method for Efficient Placement of VNF in Cloud-Edge Computing

The efficient placement of Virtual Network Functions (VNFs) has emerged as a critical challenge in cloud-edge computing environments due to the growing demand for low-latency and high-bandwidth services. The present study investigates diverse optimization techniques for VNF placement with the objective of reducing latency, optimizing resource allocation, and improving overall network performance. This article presents a comprehensive examination of cloud-edge computing, emphasizing the significance of VNF placement optimization. We propose the Remora Optimization Algorithm for VNF (ROAVNF) placement technique, a novel way to improve the sequential layout of Service Function Chains (SFCs). By making maximal use of the available computing resources, the ROAVNF method has been shown to enhance performance in comparison with state-of-the-art methods. This has been evidenced through extensive simulations on an edge network, evaluated across five metrics: energy consumption, throughput, resource cost, execution time, and end-to-end delay. The comparative analysis reveals that the proposed algorithm improves on the energy consumption of GA, BGWO, and OCPS by 56.36%, 29.34%, and 11.78%, respectively.

Abadhan Saumya Sabyasachi, Sangeeta Kumari, Biswa Mohan Sahoo
A Safeguard and Safety in E-Health Records: A Committed Clinch

Medical gadgets and the improved usability of remote health services have made mobile healthcare and medical services quite popular. Patients' growing involvement in, and concern about, their own care adds to this trend. As a result, a large volume of medical data is produced, and these datasets need to be transferred, archived, and accessed securely. In this study, we propose a practical method for maintaining the identity of clinical data while safeguarding privacy through an effective encryption system. We also discuss a proposed authorization framework that uses different levels of access, since various entities frequently access medical records with differing levels of authorization. In this study, patient data access control mechanisms are managed according to an artificial intelligence (AI) scheme: the access control framework is created using a machine learning (ML) access model, and the main justification for this architecture is the ML-based restriction on accessing the data through AI. Additionally, medical data encryption with various authorization approaches has become necessary for proper data access regulation. This could strengthen user confidence in the e-health paradigm and, as a result, improve usability on a large scale.

Khulood Abdel-Khaliq Al-Salim, Ahmed J. Obaid
Ovulation Day Prediction Using Machine Learning

Menstrual cycle tracking and ovulation prediction are vital for a woman’s reproductive health and family planning. This paper studies the advancement of menstrual cycle and ovulation prediction using machine learning techniques. We use a large dataset with 17,985 records and 117 features, which include estimated ovulation, luteal phase, tracking factors, and various factors related to intercourse. Through data preprocessing, we address duplicated data, missing values, outliers, and feature engineering to optimize the dataset for predictive modeling. Among the many machine learning algorithms available, such as decision trees, support vector machines, and neural networks, we use random forest and linear mixed models, assessed for their predictive accuracy and generalization capabilities, which are key factors in their usability. The results significantly improve menstrual cycle and ovulation prediction compared with conventional calendar-based methods, providing enhanced accuracy, precision, and external validation to ensure the model is reliable and adapts to people from different parts of the world. The potential applications extend to fertility management, contraception, and healthcare interventions, enabling individuals to make informed reproductive health decisions. In summary, machine learning offers a reliable tool for advancing menstrual cycle and ovulation prediction for reproductive health and family planning. Further research and clinical validation are essential before implementing such predictive models in real-world health care.
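A minimal sketch of the random forest variant follows, assuming a tabular cycle dataset; the file name, column names, and target are hypothetical, and plain cross-validation stands in for the external validation mentioned above.

```python
# Hedged sketch: random forest regression for predicting the ovulation day of a cycle.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("cycle_tracking.csv")                                       # hypothetical file
features = ["cycle_length", "luteal_phase_length", "age", "bbt_shift_day"]   # assumed columns
X, y = df[features], df["ovulation_day"]                                     # assumed target

model = RandomForestRegressor(n_estimators=300, random_state=42)
mae = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print(f"cross-validated MAE: {mae.mean():.2f} days")
```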

Umesh Gupta, Rohan Sai Ampaty, Yashaswini Gayathry Amalapurapu, Rajiv Kumar
Metadata
Title
Proceedings of Third International Conference on Computing and Communication Networks
Edited by
Giancarlo Fortino
Akshi Kumar
Abhishek Swaroop
Pancham Shukla
Copyright Year
2025
Publisher
Springer Nature Singapore
Electronic ISBN
978-981-9726-71-4
Print ISBN
978-981-9726-70-7
DOI
https://doi.org/10.1007/978-981-97-2671-4