2025 | Book

Innovative Computing and Communications

Proceedings of ICICC 2024, Volume 3

Edited by: Aboul Ella Hassanien, Sameer Anand, Ajay Jaiswal, Prabhat Kumar

Publisher: Springer Nature Singapore

Book series: Lecture Notes in Networks and Systems

About this book

This book includes high-quality research papers presented at the Seventh International Conference on Innovative Computing and Communication (ICICC 2024), held at the Shaheed Sukhdev College of Business Studies, University of Delhi, Delhi, India, on 16–17 February 2024. Introducing the innovative work of scientists, professors, research scholars, students, and industrial experts in the field of computing and communication, the book promotes the transformation of fundamental research into institutional and industrialized research and the conversion of applied exploration into real-time applications.

Table of Contents

Frontmatter
Forecasting Bitcoin Price in Indian Rupees Using Machine Learning Techniques

Cryptocurrency is an encrypted, digital, peer-to-peer form of currency built on blockchain technology and introduced in 2009. It serves as a medium of exchange between computers on the network without interference from any centralised authority. Bitcoin is the most widely used and valuable cryptocurrency in the world, and in India, too, many people choose Bitcoin as an investment. People want to be more aware of the possibilities and opportunities that cryptocurrencies present, in order to maintain confidence and trust in using them. The goal of this paper is to predict the future value of the Bitcoin cryptocurrency in Indian Rupees (INR) with machine learning using Python. A dataset covering approximately the past 768 days is used to train models that predict the INR value of Bitcoin for the next 10 days.
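As a rough illustration of the kind of pipeline the abstract describes, the sketch below trains a scikit-learn regressor on lagged daily closes to produce a 10-day-ahead forecast; the synthetic price series, the lookback window, and the choice of linear regression are assumptions for demonstration, not the authors' actual data or method.

```python
# Minimal sketch (not the authors' pipeline): forecast the BTC-INR close 10 days ahead
# from the previous 30 closes. The price series here is synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LinearRegression

LOOKBACK, HORIZON = 30, 10
rng = np.random.default_rng(0)
prices = 3_000_000 + rng.normal(0, 20_000, 768).cumsum()   # ~768 days of fake closes

# Supervised pairs: X = last LOOKBACK closes, y = close HORIZON days later
X = np.array([prices[i:i + LOOKBACK] for i in range(len(prices) - LOOKBACK - HORIZON)])
y = prices[LOOKBACK + HORIZON:]

model = LinearRegression().fit(X[:-100], y[:-100])          # hold out the last 100 days
mae = np.abs(model.predict(X[-100:]) - y[-100:]).mean()
print(f"held-out MAE: {mae:,.0f} INR")
print("10-day-ahead forecast:", model.predict(prices[-LOOKBACK:].reshape(1, -1))[0])
```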

Kamran Siddique, Pradeep Kumar
Machine Learning Methods for Time Series Data Processing in Air Quality Detection

Air quality indicates the degree to which the air is clean and suitable for living creatures and the environment. Recently, developing countries like India have been experiencing massive air pollution. The main cause of this poor air quality is the presence of dust particles, especially particulate matter (PM) of 2.5-micrometre size. PM 2.5 mainly comes from road dust, fuel burning, vehicles, and industrial plants. Constantly breathing this polluted air causes many health problems. Therefore, we need accurate, real-time monitoring systems that check dust particle concentration levels periodically. Data generated on a periodic or time basis is called time series data, and such data is often unstructured. Time series and unstructured data analysis is an important area in artificial intelligence, and different deep learning and machine learning methods are used for handling this data. Time series data processing includes missing data imputation, hyper-parameter optimization, normalization, and so on. This paper focuses on the various methods and algorithms used in time series data processing and the metrics used to evaluate their performance. This will help in developing new models for handling time series data and thus predicting air quality efficiently. From the analysis of existing methods, it is inferred that deep learning models like Long Short-Term Memory (LSTM) and Multi-Layer Perceptron (MLP) give better accuracy. Air quality prediction with Geospatial Artificial Intelligence (GeoAI) is a new and emerging area for future research.
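For concreteness, the following sketch walks through the preprocessing steps named in the abstract (missing-data imputation, normalization, and windowing) on a synthetic hourly PM 2.5 series; the window length and the data are illustrative assumptions rather than any configuration from the surveyed papers.

```python
# Minimal preprocessing sketch for an hourly PM2.5 series: impute gaps, scale to [0, 1],
# and build sliding windows ready for an LSTM/MLP regressor.
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(1)
pm25 = pd.Series(60 + 15 * np.sin(np.arange(500) / 24) + rng.normal(0, 5, 500))
pm25.iloc[rng.choice(500, 40, replace=False)] = np.nan   # simulate sensor dropouts

pm25 = pm25.interpolate(limit_direction="both")           # missing-data imputation
scaled = MinMaxScaler().fit_transform(pm25.to_numpy().reshape(-1, 1)).ravel()

WINDOW = 24                                               # past 24 hours -> next hour
X = np.array([scaled[i:i + WINDOW] for i in range(len(scaled) - WINDOW)])
y = scaled[WINDOW:]
print(X.shape, y.shape)    # (476, 24) (476,)
```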

Anju Augustin, Cinu C. Kiliroor
Deep Learning in the Context of Artificial Intelligence: Advancements and Applications

Leukaemia and childhood cancers are becoming increasingly prevalent in India, highlighting the urgent need for advancements in early detection methodologies. This research paper explores the application of deep learning (DL) and its techniques in examining blood cells to aid in the diagnosis of blood cancers. DL algorithms possess the ability to accurately classify and segment cells in blood smears, thereby facilitating early-stage cancer detection. The accuracy of cell segmentation has significantly improved due to the utilization of advanced DL techniques such as generative adversarial networks (GANs) (Wang et al. in Pattern Recogn Lett 141:122–128, 2021), which effectively address challenges like overlapping nuclei and morphological changes. Additionally, convolutional neural networks (CNNs), enhanced through transfer learning methods (Alakus and Akkus in Comput Biol Med 143:104551, 2022), have exhibited remarkable precision in classifying blood cancer cells. The primary aim is to combine progress in deep learning with a Clinical Decision Support System (CDSS) that quickly converges to provide accurate diagnoses and improve patient outcomes. The incorporation of federated learning models with explainable AI (XAI) (Saraswat et al. in IEEE Access 10:84486–84517, 2022), which ensures privacy and transparency in data processing, has bolstered trust in DL applications. The shift in DL from segmentation to classification signifies a paradigm shift in how India tackles cancer detection.

Arpana Chaturvedi, Nitish Pathak, Neelam Sharma, R. Mahaveerakannan
Public Universities in South-West Nigeria: Access and Use of Information Communication Technology by Lecturers and Students

The aim of this study was to ascertain the access to and use of information communication technology by lecturers and students in public universities in South-West Nigeria. A descriptive survey research design was adopted. Using a multi-stage sampling procedure, a sample of 1100 students and 1100 lecturers was chosen from a population of 11 public universities that are owned and funded by government. A self-designed questionnaire titled Access and Utilization of ICT Facilities (QAUIF) was used for data collection. Frequency counts, percentages, and Chi-square statistical tests were used to analyze the data collected. Results revealed that ICT facilities are not adequately available in virtually all the public universities and that lecturers' and students' knowledge and proficiency are very low. On the basis of the findings, recommendations were offered, among which is that ICT laboratories fully equipped with up-to-date ICT facilities should be established, with adequate incentives to train large numbers of students and lecturers and thereby facilitate the all-round attainment of the goals of university education.
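As an illustration of the Chi-square analysis mentioned above, the sketch below tests whether ICT access level is independent of respondent role on a small contingency table; the counts are invented for demonstration and are not the study's data.

```python
# Chi-square test of independence on a hypothetical role vs. ICT-access table.
from scipy.stats import chi2_contingency

#                    low   moderate  high ICT access
observed = [[420, 510, 170],   # students  (hypothetical counts)
            [380, 540, 180]]   # lecturers (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A p-value below 0.05 would suggest ICT access depends on the respondent's role.
```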

Muyiwa Sunday Ajimuse, Vikash Yadav, Ayodeji Olorunfemi Olawole, Sarvachan Verma
Prognosis of Parkinson’s Disease Different Phases by Exploiting Deep Learning Models: Comparative Study

In the medical healthcare system, a significant amount of data from medical and research studies, different types of tests, imaging, prescriptions, and medication records on vocal and hand tremors is being accumulated and stored at a tremendous rate. Most Parkinson's disease (PD) sufferers are afflicted by vocal cord issues and hand tremors, and speech impairment and hand tremors are early signs used in Parkinson's disease detection. The purpose of this work is to discuss the application of deep learning algorithms to improve the detection of Parkinson's disease (PD) phases utilizing acoustic data from vocal samples and handwriting samples. PD sufferers often experience vocal cord issues and hand tremors, making these symptoms crucial for early detection. The study uses various deep learning models, including RNN, CNN, and Vision Transformer, to extract features from the collected data. The study makes use of an extensive collection of data from the UCI machine learning repository, which includes voice and hand tremor samples from 62 Parkinson's disease patients and 15 healthy persons. Three types of recordings are used: static spiral tests, dynamic spiral tests, and stability tests. The goal is to identify the best deep learning algorithm for early Parkinson's syndrome detection by analyzing many modalities, including cardinal motor traits (bradykinesia, stiffness, tremor, axial indications) and mobility symptoms (gait, handwriting, speech, and EMG). The EfficientNetB5 model outperforms other models with an accuracy of 97.65%, while the Vision Transformer model shows promise as an alternative to CNN models but achieves an accuracy of 94.80%, lower than EfficientNetB5. The paper concludes by proposing new guidelines for further research on deep machine learning models for the automatic detection of PD, aiming to enhance early identification of PD globally and to address the challenges in adopting contemporary computer-aided diagnosis systems in health care.

Adimulam Raghuvira Pratap, Annamalai Suresh
Smart Solutions for Waste Water Overflow: A GIS Case Study in Karbala, Iraq

The sewerage system is an essential component of urban infrastructure. It can also cause environmental pollution when sewage appears on the surface because of a flood resulting from a blockage or break in a pipe, or because the pipe cannot accommodate additional amounts of drainage. A key future concern for the drainage system is avoiding floods when the population increases suddenly during festivals or religious visits. This research presents a model based on geographical information systems to report the occurrence of a flood, with a smartphone application that sends the number of the manhole causing the sewage overflow to a dashboard, so that the appropriate decision can then be taken by decision-makers to solve the problem. The city centre of Karbala, Iraq, was taken as a case study because the city receives many visitors during religious visit periods. Various experimental operations of the model have shown that it allows real-time monitoring, helps in treating sewage overflows, and reduces the resulting environmental pollution.

Ihsan Kadhim Abed, Fadi Hage Chehade, Zaid F. Makki
Quantum-Enhanced Cyber Security Framework for E-Commerce Platforms

This manuscript introduces a novel Quantum-Enhanced Cyber Security Framework for E-commerce platforms, designed to address the emerging challenges posed by quantum computing to cyber security. The framework integrates quantum computing principles into key aspects of cyber security, offering a robust solution against both classical and quantum threats. Central to the proposed approach are three innovative components: quantum key distribution, a Quantum-Resilient Cyber Trust Verification Protocol, and a Distributed Quantum Ledger for Cyber Secure E-commerce. Simulations validate that the proposed approach significantly outperforms traditional security methods in key generation rate and error reduction, demonstrating its efficacy in a quantum-threatened cyber domain.

Fauziyah, Zhaoshun Wang, Mujahid Tabassum
Comparative Analysis of Malware Detection Response Times Across Android Versions: An Emphasis on the “Hoverwatch” Application

As Android-based smartphones become increasingly prevalent in our digital age, they also become inviting targets for malicious applications. Our research demystifies malware detection, with a special focus on response times across multiple Android versions. We began by examining the vast influence of Android and the need for robust malware detection in today’s market. To understand the threat landscape, we investigated application-based threats and their impact on users and provided an encompassing review of the current malware detection methodologies, which include static, dynamic, and hybrid techniques. Our study centres around comprehensive tests conducted on distinct Android emulators, with the intent to measure malware detection response time. We used a known malicious app, “Hoverwatch,” for this experiment. Our findings reveal notable disparities in detection times, emphasizing the need for constant advancements in defence systems and in a number of test cases the need for improvements in real-time detection capabilities. This research offers insights into the effectiveness of current Android malware detection methods, stressing response times. We underscore the necessity for regular updates and system enhancements to combat the evolving threat environment.

Chiemela Ndukwe, Elaheh Homayounvala, Hassan Kazemian, Istteffanny Araujo
Machine Learning for Management of Data: The Role of Machine Learning in Marketing Mix Modelling and Decision-Making

Marketers can harness the power of advanced data analysis by incorporating learning algorithms into marketing mix modelling (MMM). This combination increases the predictive power and accuracy of MMM models, enabling marketers to gain a deeper understanding of the complex relationships between marketing efforts and financial outcomes. Furthermore, machine learning (ML) provides dynamic, real-time analysis of marketing strategies, facilitating prompt decision-making and the optimisation of marketing resource allocation. In essence, ML offers the prospect of a more effective quantitative method for marketing investment decisions through precise measurement and optimisation. In the field of Internet marketing, automated learning is a cutting-edge strategy because it captures, assesses, and uses opinions and comments about businesses to determine the feelings associated with a brand. Marketers can use this data to tailor their marketing interactions so that they speak specifically to each prospective customer and increase sales of their products. Machine learning techniques help enhance customer visits by categorising the different click-through reactions to businesses that interact with them. Deep learning makes it possible to understand digital customers better by dividing the massive daily data cache into several segments and using pattern analysis to create insights from it. Based on judgmental sampling, the study chose 1250 digital users in India to examine how machine learning affects various capabilities that deal with consumer behaviour, decision-making, and emotions in online marketing. These days, machine learning and artificial intelligence are the two main digital technologies changing people's lives, and machine learning has transformed the way value is produced in digital marketing. Customers have many alternatives when it comes to digital platforms, and system intelligence can assist advertisers in offering ideal products to customers in a competitive market; machine learning is a popular approach that now shapes everyday tasks. The available literature indicates that additional investigation into the use of machine learning in digital advertising is necessary. The developments in machine learning have opened up new opportunities for digital marketing firms, and this study presents the potential for clients in the services industry, especially in digital marketing. Amazon and Facebook are two examples of how machine learning is being used, and both examples might enhance digital marketing. Starting with individual connections, the study moves to unified teams of connected participants, to book publication, and ultimately to creating a space for collaboration between employers and employees.

Meghna Chaudhary, M. Afshar Alam, Sherin Zafar
Ontological-Based GIS Approach for Assessment of Soil Pollutants

Despite the importance of the cement industry, it is considered one of the most environmentally polluting industries. The study aims to reveal the chemical and physical contamination of soil with pollutants and their spatial variation in the form of an ontology. The pollutants and their relations are represented as knowledge graphs, commonly known as an ontology, and the information related to spatial data is retrieved in the form of the Web Ontology Language (OWL). The results showed that most of the soil samples collected from different parts of the study area are contaminated with heavy metals when compared to the permissible standards for the concentration of these metals in soil. The use of a Geographic Information System (GIS) and ontology helps decision-making officials to develop policies and strategies to reduce environmental pollution in the areas surrounding these factories, protecting environmental elements such as soil, air, and water in nearby areas.

Hussien Mohson abide, Fadi Hage Chehade, Zaid F. Makki
Smartwatch as a Pervasive Computing Application in Health Metrics Tracking

Pervasive computing, characterized by the integration of technology into everyday life, has transformed the healthcare landscape. This research paper explores the significant role of smartwatches as a vital tool of pervasive computing in healthcare. Smartwatches have evolved into powerful health monitoring and management devices, providing continuous and unobtrusive tracking of health metrics such as heart rate, sleep patterns, physical activity, and more. They offer early detection of critical health events, personalized health insights, and motivation for lifestyle changes. Moreover, smartwatches contribute to telemedicine, telehealth, and remote patient monitoring, enhancing the accessibility and quality of healthcare services. This paper delves into the future scope of smartwatches in healthcare, highlighting advanced health monitoring, predictive health insights, personalized healthcare, and their potential impact on wellness and mental health. It also discusses the technologies, applications, key components, and benefits of pervasive computing in healthcare, presenting the smartwatch as an effective application of pervasive computing in this domain.

Akshita Sah, Shishir Saurav, Aditya Meena, Sushruta Mishra, Naresh Kumar
Review of Various Neural Style Transfer Methods: A Comparative Study

NST, or neural style transfer, has revolutionized the field of image processing by allowing artistic styles to be applied to photographs. First introduced by Gatys et al., NST relies on a slow, iterative optimization process. However, recent advances have introduced faster and more efficient approaches, such as adaptive instance normalization (AdaIN) and Johnson's method. Gatys's method, which laid the foundation for NST, uses a convolutional neural network (CNN) to extract the content of one image and the artistic features, or style, of another. Although it works by reducing the discrepancy between the features of the content image and the style image, this procedure, while groundbreaking, is tedious. By rethinking feature normalization, AdaIN introduced a technique for rapidly combining content and style features from arbitrary images: it removes the need for time-consuming optimization, enabling real-time style transfer with excellent flexibility. Johnson's technique instead trains a feed-forward network with perceptual losses computed from a pre-trained network, so that stylization is performed in a single fast forward pass while still blending content and style well. This research provides a comprehensive insight into NST techniques and their evolution, highlighting their potential for image processing.
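To make the AdaIN idea concrete, here is a minimal NumPy sketch of the standard AdaIN operation (channel-wise re-normalization of content features to style statistics); it is the textbook formula for illustration, not the full transfer network from any of the surveyed papers.

```python
# AdaIN: re-normalize each channel of the content feature map so that its mean and
# standard deviation match those of the style feature map.
import numpy as np

def adain(content: np.ndarray, style: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """content, style: encoder feature maps of shape (C, H, W)."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    return s_std * (content - c_mean) / c_std + s_mean

out = adain(np.random.rand(64, 32, 32), np.random.rand(64, 32, 32))
print(out.shape)   # (64, 32, 32): content structure carrying style statistics
```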

Akash Goel, Palak Singh, Ragini Rani, Kalash Jain
Mental Health Assessment Using EEG Sensor and Machine Learning

Heart failure stands as a significant public health issue with high mortality and morbidity rates. Timely anticipation and detection of heart failure play a vital role in facilitating prompt intervention and improved prognoses. Recently, machine learning approaches have emerged as promising tools for prognosticating and identifying heart failure cases. This survey paper presents an extensive overview of the existing literature concerning machine learning-centred strategies for the early prognosis and diagnosis of heart failure. We explore the diverse ML techniques frequently employed in heart failure prognosis and diagnosis, including artificial neural networks, random forests, logistic regression, support vector machine models, and deep learning. Furthermore, we examine the datasets and evaluation metrics used to assess the efficiency of machine learning models in prognosing and diagnosing heart failure. Our investigation culminates in a synthesis of the principal findings compiled from the reviewed literature. We undertake a comparative analysis of the performance exhibited by distinct ML technologies, while also addressing the obstacles and constraints inherent in the utilization of machine learning for heart failure prognosis and diagnosis. By offering this survey, we furnish a valuable resource for researchers and medical practitioners who have a vested interest in leveraging machine learning techniques for the early prognosis and diagnosis of heart failure.

Man Singh, Chetan Vyas, Bireshwar Dass Mazumdar
Leveraging the Synergy of Supply Chain Analytics, Visibility, Innovation, and Collaboration to Improve Environmental and Financial Performance: An Empirical Investigation

This study aims to develop a comprehensive framework for supply chain analytics and its outcomes. To this end, it establishes the relationships among supply chain analytics capabilities (SCAC), supply chain visibility (SCV), supply chain collaboration (SCC), supply chain innovation (SCI), and stakeholder trust (ST) in achieving environmental performance (EP) and financial performance (FP). In this quantitative study, a deductive approach was followed, using the survey method to collect responses from respondents. Data were collected from various business sectors with different firm sizes, operations, and countries. The partial least squares structural equation modeling (PLS-SEM) method was used to analyze the data. The findings of the study support the conceptual framework. The direct effects of SCAC on SCV, SCC, and ST were found to be significantly positive, and the effects of SCV on SCC and ST showed the same pattern. Additionally, indirect effects were found from SCAC through SCV to SCC, and from SCAC through SCV to ST. Previous quantitative supply chain literature has focused on big data analytics and capabilities and big data predictive analytics but has overlooked supply chain analytics.

Muhammad Noman Shafique, Elisabeth T. Pereira
Real-Time Investigation on Electromagnetic Radiation Exposure from Cell Tower

Non-ionizing radiation from cell towers is considered the major source of electromagnetic radiation (EMR) in the outdoor environment. The unauthorized installation of mobile towers in populated areas creates health issues for living creatures, including plants, animals, birds, insects, and human beings. The effects on each creature in the environment lead to the degradation of the living ecosystem. These reported effects on living things are analyzed based on the absorption rate when they are exposed to EMR. This work aims to analyze the rate at which the living community is exposed to EMR based on the field characteristics near and far from a mobile tower. The intensity of this exposure can be determined by analyzing the variation in the electric (E) field distribution in close proximity to the identified mobile tower. The cell tower selected is located in a highly populated area that includes educational institutions. Within the selected area, it can be observed that the vegetation shows growth retardation and lacks seedlings, with dry leaves and bark. These reported issues in the study area correspond to differences in the field characteristics examined with an Electrosmog meter (PCE-EM-29) at different distances from the selected cell tower. The absorption of EMR is influenced by the characteristics of the electric field, and variations in the absorption provide evidence of the reported issues in the region near the tower. As the E-field in the nearby region increases, the absorption also increases.

L. Meenu, S. Aiswarya, K. A. Unnikrishna Menon, Sreedevi K. Menon
Deep Cognitive Learning for Enhanced Pneumonia Detection: Employing CNNs for Precise Classification

In the context of holistic pulmonary health, pneumonia remains a grave condition identified by inflammation of the lungs; consequently, the person experiences a rise in body temperature, cough, and chest pain. It is vital that this condition is identified and treated in its early stages, as this can enable a remarkable recovery. This study presents a deep learning model based on convolutional neural networks that helps distinguish pneumonia from normal chest X-ray images. The model demonstrates high accuracy and highlights the strengths of this approach. The design and validation are continually refined to identify symptoms of pneumonia and separate it from other lung illnesses. The excellent accuracy rates are compelling for medical image analysis, and the design is especially vital for pneumonia because immediate treatment corresponds directly to better patient outcomes. In deployment, the model can supply healthcare professionals with accurate detections. This work highlights the transformational scope of incorporating artificial intelligence into the medical industry to improve patient well-being, and our study is an impactful step toward accessible options for early pneumonia identification.
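As a rough sketch of the kind of CNN classifier described (and not the authors' validated architecture), the following Keras snippet builds a small binary pneumonia-vs-normal network; the layer sizes, input resolution, and the random stand-in images are illustrative assumptions.

```python
# Small binary CNN sketch for chest X-ray patches (pneumonia vs. normal).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(128, 128, 1)),                 # grayscale X-ray patch
    layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),             # probability of pneumonia
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random stand-in images just to show the training call; real X-rays would go here.
X = np.random.rand(32, 128, 128, 1).astype("float32")
y = np.random.randint(0, 2, 32)
model.fit(X, y, epochs=1, verbose=0)
print(model.predict(X[:1], verbose=0))
```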

Rishit Pandey, Archisa Singh, Vaibhav Kapoor, Sushruta Mishra, Shalini Goel, Rajeev Sobti
Object Counting from Images Using Deep Learning Technique

This article gives an overview of the object counting problem and its uses. It also describes various ways of counting objects in still images and video streams. Object counting is an important task in machine learning: it helps to identify objects in an image so that the number of objects can be determined. However, there are many challenges in object counting with deep learning techniques, such as illumination variation, occlusion, and real-time counting. We also review some recent papers to give an idea of current technology. Finally, through one example, we discuss how object counting can be performed using deep learning techniques.

Arishpreet Kour Bali, Amit Kumar
A Survey on Predictive Modelling for Diverse Climate Condition and Heavy Rainfall

Predictive modelling is crucial for understanding and adapting to diverse climate and rainfall patterns, particularly in the context of climate change. In an effort to overcome these issues, the focus of this survey is on the use of machine learning approaches. For many types of sectors including agriculture, disaster prevention, and water resource management, accurate rainfall forecasts are crucial. The introduction of ML approaches is a result of the fact that traditional meteorological models frequently fall short in capturing complicated weather-related processes. In this study, a survey and analysis of the various architectures and ML models, together with their accuracy and performance outcomes, are discussed.

R. Logeswaran, S. Anirudh, M. Anousouya Devi
Analysis of Inclination Toward IT-Based Startups Among CSE Undergraduates

The startup ecosystem in India has been rapidly expanding, particularly in the past decade. A large and growing pool of tech talent, rising investment from domestic investors, and favorable government regulations are driving this expansion. Young entrepreneurs have ventured into every industry across the country. Apart from the highs and lows in the journey, startup culture in India values creativity, skills, dedication, horizontal hierarchy, and open communication more than just a degree. This paper focuses on the scope of startup ventures among budding engineers, especially in the IT sector. The dataset comprises responses collected from CSE engineering undergraduates in a third-tier city in India. The statistical analysis includes exploratory statistics, logistic regression, and a non-parametric Chi-square test. Interpreting all these tests can help build proper strategies for budding undergraduates to fulfill their dreams of founding a startup. The research study can support education authorities in planning their respective approaches to boost startup ventures in the early stages of undergraduate education.

Raghav Uparkar, Sudhanshu Maurya, Satyajit Uparkar, Vrince Vimal, Monali Gulhane, Nitin Rakesh
A Cognitive Predictive Approach for Underwater Mine Detection

The presence of underwater mines can have serious consequences for ships and submarines, impeding their ability to navigate the vast oceans and seas. This issue demands urgent attention, especially since many mines go undetected for decades. It's important to mention that the expenses associated with creating and placing a mine typically range from 0.5 to 10% of the expenses involved in removing it. Furthermore, the process of removing mines can take up to 200 times longer compared to the time it takes to lay them. To tackle this problem, a range of mine countermeasure (MCM) techniques have been developed, including mine-hunting. This technique involves using Sonar signals to identify mine-like objects (MLO) and employing various classification methods to differentiate them from benign objects. In this research, we will compare the efficacy of classical machine learning models and the binary ANN classifier, which uses an encoder for feature extraction.

Danish Khan, Kumar Tejashwa, Sushruta Mishra, Hrudaya Kumar Tripathy, Naresh Kumar
An Investigation on Coral Reef Classification Using Machine Learning Algorithms

Due to growing challenges, coral reefs—one of the planet’s most ecologically varied and commercially significant ecosystems—need to be the focus of increased conservation efforts. This paper explores the rapidly developing field of machine learning applications for automatic categorization from underwater photography, as well as the changing terrain of coral reef classification. We emphasize the need for effective monitoring and conservation efforts because we acknowledge the critical role that coral reefs play in maintaining marine biodiversity and coastal protection. Machine learning approaches are explored in the context of growing concerns like habitat loss and coral bleaching, combined with their biological importance. This study represents a paradigm leap in our understanding of and response to the complex dynamics of coral reefs, going beyond the recent advances in automation. Through the provision of real-time, nuanced insights about the composition and health of reefs, machine learning emerges as a lighthouse that illuminates the route toward successful reef management. The wider ramifications are significant, going beyond simplified procedures to radically change our approaches to conservation.

S. Nithish Karthik, M. Hariharasudhan, M. Anousouya Devi
Experimental Study to Analyze the Zonal Financial Independence of Women Using Fuzzy Logic

Gender inequality and discrimination against women are social evils from which our society has been suffering for decades. According to records, the financial literacy rate among women in India is not good; despite certain progress made in the last few decades, women still face socio-economic discrimination in society, which hampers their emotional and financial independence. In this research paper, the authors analyze the financial status of women using fuzzy logic, with data collected from the official site of the Government of India. The research found a significant difference in the financial status of women in different zones of the country. The study can help stakeholders and policymakers acknowledge the actual position of women in different zones across the nation and develop targeted interventions for the upliftment of women in Indian society.
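As a toy illustration of the fuzzy-logic approach mentioned above, the sketch below maps a hypothetical zonal financial-independence score to "low", "medium", and "high" linguistic levels with triangular membership functions; the break-points are invented, not the authors' calibrated membership functions.

```python
# Triangular fuzzy membership functions over a 0-100 financial-independence score.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, then falls to c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

score = 62.0                                   # hypothetical score for one zone
memberships = {
    "low":    tri(score, 0, 0, 40),
    "medium": tri(score, 25, 50, 75),
    "high":   tri(score, 60, 100, 100),
}
print(memberships)    # e.g. medium ≈ 0.52, high ≈ 0.05, low = 0.0
```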

Prisha Gupta, Yogesh Aggarwal, Sonakshi Vij, Divya Agarwal
Constructing and Processing 3D Face Structures Using Structure of Motion Without Complex Instruments

In the world of biometric authentication, 3D face recognition stands as a pivotal area. This paper therefore presents a groundbreaking method for building and processing 3D face structures that may later be used for superior, multidimensional face recognition. The method obviates the need for complicated scanners and heavy-duty graphics processing hardware. The proposed approach uses a set of 2D RGB images extracted from a half-minute video covering the frontal half of the face from one ear to the other. It then applies structure from motion and photogrammetry, which enable the extraction of facial features in the form of point clouds. These extracted face structures are processed using the iterative closest point (ICP) method for shape alignment, coupled with density-based spatial clustering of applications with noise (DBSCAN) for removing noisy points. This research not only signifies a significant step in the domain of 3D face recognition but also has far-reaching implications. The ramifications of these structures extend beyond mere technological innovation, enabling realistic integration in everyday scenarios ranging from mobile devices to surveillance cameras. Consequently, the research contributes to a more secure digital landscape, improving the resilience of authentication systems and reinforcing the foundations of digital trust.
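To illustrate the noise-removal step described above, the sketch below uses DBSCAN to flag sparse outlier points in a 3D point cloud as noise (label -1) so they can be dropped before ICP alignment; the synthetic cloud and the DBSCAN parameters are illustrative assumptions, not the authors' settings.

```python
# Denoise a 3D point cloud: DBSCAN marks sparse spurious points as noise (-1).
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(7)
face_points = rng.normal(0, 0.05, size=(2000, 3))    # dense stand-in for a face surface
noise = rng.uniform(-1, 1, size=(50, 3))             # sparse spurious reconstruction points
cloud = np.vstack([face_points, noise])

labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(cloud)
clean = cloud[labels != -1]                           # keep only clustered (non-noise) points
print(f"kept {len(clean)} of {len(cloud)} points after denoising")
```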

Harshit Mittal, Trilochan Singh Rathore, Neeraj Garg
The Impact of Artificial Intelligence on Crisis Management

The research paper explores the ability of artificial intelligence (AI) to develop responses to crises, focusing on the applications and solutions that AI can provide in crisis response. The use of AI satellites, AI-powered drones, and AI rescue robots is highlighted. The proposed solution discusses AI satellites that can identify severely affected zones and AI-powered drones that can cover a large area, deliver food and medicines, detect odours, and locate people. Another efficient solution is AI rescue robots equipped with cameras, sensors, and sonar to help rescue people, avoid obstacles, and coordinate with rescue teams to perform their tasks efficiently. The paper also acknowledges the hurdles involved and discusses how they can be addressed. The solutions provided offer promising opportunities for quick and effective decision-making, rapid responses, and the ability to handle complex situations.

Arya Kashikar, Sudhanshu Maurya, Monali Gulhane, Vrince Vimal, Nitin Rakesh, Manish Kumar
Wheat Disease Detection Using YOLOv8 and GAN Model

Wheat leaf disease is a major concern in agriculture, as it leads to significant crop yield losses. It is necessary to diagnose wheat leaf disease in its early stages to ensure food security and sustain global wheat production. The main objective of this paper is to present a method for wheat disease detection using the You Only Look Once algorithm, version 8 (YOLOv8), and generative adversarial networks (GANs). YOLOv8 is a well-known object detection method that can detect and classify objects in real time, processing images quickly and accurately, which makes it an ideal choice for this task. One major problem is the limited training data for various wheat diseases. To address this, the authors introduce a conditional generative adversarial network (C-GAN)-based data augmentation technique that generates synthetic images of wheat leaves. This technique increases the volume of the training dataset, thus improving the overall generalization of the model. The proposed YOLOv8-trained model is further compared with other existing models and achieves an accuracy of 99.8%, which is better than the other models.
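For orientation, here is a hedged sketch of training a YOLOv8 detector with the ultralytics package, as the abstract describes; the dataset YAML path, the sample image, and the epoch count are placeholders, and the C-GAN augmentation is assumed to have already produced the training images.

```python
# Train and run a YOLOv8 detector; "wheat_leaf_disease.yaml" and "leaf_sample.jpg"
# are hypothetical placeholders for the dataset config and a test image.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                       # start from a pretrained checkpoint
model.train(data="wheat_leaf_disease.yaml", epochs=50, imgsz=640)

results = model.predict("leaf_sample.jpg")
for box in results[0].boxes:
    print(int(box.cls), float(box.conf))         # predicted disease class and confidence
```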

Dayal Rohan Volety, Raman Thakur, Sushruta Mishra, Shalini Goel, Rachit Garg, Nagendar Yamsani
Customer Churn Rate Prediction Using Machine Learning Techniques for E-Commerce Sector

Competition in the e-commerce industry is growing rapidly, and the primary challenge lies in retaining customers through quality service and reasonable pricing. Predictive customer churn techniques can identify potential losses, allowing for improved marketing strategies. Meeting high demand and enhancing loyalty necessitate tailored services and strategies. Nevertheless, e-commerce customer churn is intricate, characterized by nonlinear fluctuations and asymmetrical customer types, and imbalanced data further complicates the scenario. The study offers proactive methods and models for online marketplaces to reduce customer churn effectively. Classification is performed using machine learning models. This research innovates churn prediction by employing a unique ensemble of machine learning models. Its distinctive features include in-depth exploratory data analysis (EDA), systematic model comparison, and the novel application of XGBoost as a unifying force. This approach sets the study apart, offering a comprehensive and advanced methodology for predicting customer churn rate.
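As a minimal sketch of the ensemble idea in the abstract, the snippet below fits an XGBoost classifier on an imbalanced synthetic churn dataset with class weighting; the features, hyper-parameters, and data are assumptions for illustration, not the study's settings.

```python
# XGBoost churn classifier on synthetic, imbalanced data (10% churners).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=10, weights=[0.9, 0.1], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

clf = XGBClassifier(n_estimators=200, max_depth=4,
                    scale_pos_weight=(y_tr == 0).sum() / (y_tr == 1).sum(),  # handle imbalance
                    eval_metric="logloss")
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["retained", "churned"]))
```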

Muskan Saxena, Nikita Aggarwal, Rekha Gupta
Advancements in Vision-Based Deep Learning Techniques for Enhancing Quality Inspection in Submersible Pump Impellers

Quality inspection of submersible pump impellers is a significant factor and an essential operation for ensuring top performance and reliability in different applications. This paper discusses how computer vision and deep learning techniques facilitate the identification of defects in pump impellers within an industrial quality control unit. The technical review aims to provide a broad survey of a variety of deep and transfer learning approaches to quality control in manufacturing. The purpose of the study is to explore different approaches to identifying and classifying impeller defects using an open dataset containing 7348 top-view casting manufacturing images. The goal is to explore and evaluate various recent implementations in computer vision and deep learning and arrive at the best approach for detecting defects. A comparative analysis of different novel techniques is also carried out as part of this research, and the insights gained can optimize industrial quality control techniques, aligning them with industrial standards.

Judeson Antony Kovilpillai, K. C. Krishnachalitha, Puneet Kumar Yadav, K. Lalli, S. Jayanthy, Soumi Dhar
Federated Learning for Personalized Tourism Promotion: Balancing Recommendation Accuracy and User Privacy

Deep learning allows a recommender system to discover deep patterns and representations from data, capturing complex interactions between users and items across multiple domains. Deep learning manages massive amounts of data and produces reliable representations that affect the quality of cross-domain recommender systems. We utilize a federated learning algorithm to ensure that each domain retains control over its data and contributes to the global improvement of the recommendations, while avoiding the sharing of sensitive information. In many real-world scenarios, privacy concerns restrict the sharing of data. We propose a novel model for personalizing the recommendation system by embedding federated learning techniques in tourism promotion. This model enables collaborative training across multiple domains to provide diverse recommendations while keeping data secure.

S. Amutha, P. Salini
Empowering Real-Time Communication: A Seamless Chatting System Using Websocket

Real-time chatting systems are at an all-time high in terms of popularity. They allow the users to communicate among themselves instantly. The aim of this paper is to present a solution to implement a real-time chatting system in a feasible and applicable environment. This paper utilizes components such as Websockets using socket.io in Node.js. Websockets are a modern technology which enables the server and client to establish a continuous connection for real-time communication, and socket.io is a JavaScript library which provides a high-level abstraction of Websockets. In conclusion, this research explores the vast features of Websocket technology in the field of real-time communication. This paper advocates for the widespread adoption of Websocket-based chatting systems, bringing closer the new era of seamless interaction in the modern age.

Kaiwalya Deshpande, Ayush Jain, Abhinav, Sushruta Mishra, Shalini Goel, Rachit Garg
An Efficient Framework for Student’s Club Recommender System Using Machine Learning Models

In academic settings, students routinely encounter challenges in finding clubs and co-curricular activities that suit them. This work focuses on a Club Recommendation System (CRS) that aims to provide club recommendations tailored to individual preferences using the Factorization Machines (FM) model. The wide range of student interests and the dynamic nature of club environments pose serious challenges that such a system must address. Reasoning through the complexity of preferences and memberships is a difficult task; fortunately, machine learning and factorization techniques can model these preference problems effectively. The proposed CRS, after data processing and matrix factorization, automates personalized club recommendations for each student. Through this approach, a diverse community of students can engage with many clubs.

Jayanth Gurajada, R. Anu Keerthi, G. Santhandeep, S. Sandosh
Implementing IoT for Energy-Efficient Smart Street Light Management

Every city needs street lights because they provide improved night vision, safer roadways, and access to public spaces. At the same time, street lights consume a substantial proportion of electricity. Even when ambient lighting is sufficient, the traditional street light system provides illumination at full brightness from sunset to morning. By automating light shutoff, this power waste can be eliminated, and the energy can be used effectively for other purposes, such as residential, business, and transportation uses. A street light control system with Internet of Things (IoT) capabilities and a solar panel can accomplish this. The primary objective of the smart street light management system (SSMS) employing IoT is to conserve energy by minimizing power consumption, lowering the need for personnel, and saving lives from accidents. SSMS is one of the intelligent programmes that necessitate significant energy expenditure when constructing smart city infrastructure. Indisputable price reductions for street lighting are possible with smart management of the lights. The increased importance given to energy conservation and routine maintenance has led to the emergence of cutting-edge technologies that permit significant energy savings, the utmost care for the environment, and a decrease in traffic accidents. This research paper's main objective is to describe the significance of a system for energy-saving street lights and to proceed towards creating smart and intelligent streets.

Kriti Jaiswal, Syed Anas Ansar, Amrendra Kumar Sharma, Mohd Asim Sayeed, Nupur Soni
Explainable Artificial Intelligence (XAI) in Critical Decision-Making Processes

Methodological advancement based on Artificial Intelligence (AI) has developed dramatically during the past few years in various sectors. The majority of these models are intrinsically complex and lack explanations of the decision-making process, hence the moniker “Black-Box”. Explainable AI (XAI) has emerged with significant implications for addressing the opacity of conventional black-box models. With a focus on the inscrutable AI, this study investigates the terrain of critical decision-making procedures. Commencing with a little background context, an analysis of existing techniques like Local Interpretable Model-Agnostic Explanations (LIMEs) and Shapley Additive exPlanations (SHAP) is presented. Concrete applications stem from insightful case studies that show XAI's revolutionary benefits in a variety of real-world settings. Moreover, to accentuate the delicate balance between privacy concerns and transparency, the ethical repercussions of employing XAI are examined. This work concludes with a discussion on how crucial it is for XAI to strengthen the foundation of accountability and trust in the vital decision-making process.
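To ground the discussion of post-hoc explanation tools such as SHAP, here is a minimal sketch that computes SHAP values for a tree ensemble on synthetic tabular data; the model, data, and feature meanings are illustrative assumptions rather than any of the paper's case studies.

```python
# Per-feature SHAP contributions for a random-forest regressor on synthetic data.
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=300, n_features=5, random_state=0)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:5])     # shape (5, 5): contribution of each feature
print(np.round(shap_values[0], 2))             # why the model scored the first sample as it did
```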

Swati Arya, Shruti Aggarwal, Nupur Soni, Neerav Nishant, Syed Anas Ansar
Impact of Digital Media and Digitalized Transformation on Talent Acquisition

Digitalized channels for recruitment have become widespread due to technological advances, and candidates have the most influence over the hiring process in this day and age. Because of the increasingly competitive environment in which businesses operate, recruiters need to attract the appropriate people. The ability to connect personally and professionally through a relevant and suitable digital social networking platform is essential for recruiters. The research evaluates LinkedIn's effectiveness as a digital media platform by analyzing its information content and the website's ease of use. A survey with a predetermined format was sent to a sample of 150 recruiters in Chandigarh as part of the research carried out there. Utilizing factor analysis, we discovered several characteristics of LinkedIn that contribute to its widespread use by recruiters and hiring teams. The statistical techniques of correlation and regression made it possible to investigate the factors influencing a person's perception and willingness to use LinkedIn. According to the results, perceived utility and information relevance are the two factors that significantly influence recruiters' intentions to use LinkedIn.

Sonika, Ashita Chadha
An Exploration of the Factors Affecting Work-Life Balance: A Study on Women Police Personnel in India

Throughout history, the pivotal position occupied by women in the community has been crucial in fostering stability, facilitating progress, and sustaining growth. In every community, women assume the primary responsibility for the care of children and elders. In present-day society, women have made significant strides in terms of professional advancement and now work alongside their male counterparts. In the Indian context, a significant proportion of economic endeavours are undertaken by women across several domains, including family and caregiving responsibilities, agricultural pursuits, educational endeavours, industrial activities, banking operations, and service sector engagements. The primary objective of this research is to ascertain the determinants that influence the maintenance of an optimal work-life balance among female police personnel in India. To achieve this goal, a sample of two hundred and ten (210) female police personnel was randomly selected from various police stations in the states of Haryana and Punjab, India. The collected data were subjected to analysis using several statistical techniques, including validity and reliability checks, multiple regression analysis, analysis of variance, and hypothesis testing, and the evaluated data are presented in tabular format. This study reveals a significant influence of both internal and external factors on the work-life balance of female police personnel in India. The significant influence of a heavy workload on job stress, productivity, and job satisfaction among police personnel underscores the necessity of implementing policies that promote work schedule flexibility, workplace support, and work-life balance. According to this research, the adoption of a work-life balance (WLB) policy is of utmost importance for institutions in order to foster a positive working environment for female police personnel.

Kavita Kumari, Rupali Arora
A State-of-the-Art Review of Deep Learning-Based Object Detection Methods and Techniques

The aim of object recognition is to locate and recognize objects in an image or a video. Before the development of deep learning, object recognition required several processing steps. Many commonplace applications, including security systems, video surveillance systems, self-driving cars, and guides for the blind and visually handicapped, are built on deep learning and object detection. This paper offers a comprehensive review of deep learning-based object identification approaches, covering various object detection methodologies, deep learning-based object recognition frameworks, and specific model attributes. In addition to several other models like VGGNet, ResNet, DenseNet, and AlexNet, this paper also covers the construction and operation of the Convolutional Neural Network (CNN). The underlying history of object detection is also covered, with a focus on object recognition applications such as face and human detection, weapon recognition, and pedestrian identification. A detailed summary is also presented that may help to serve further enhancement in the related domain of research.

Chhaya Gupta, Nasib Singh Gill, Preeti Gulia
Comparison of SPV System Performance with DAST System Using MPPT Algorithms

A solar panel has a low rate of energy conversion due to factors such as the temperature around the panel, atmospheric conditions, shadows on the panel, dirt, ice, inverter efficiency, and battery efficiency. These factors reduce the efficiency of the panel, i.e., the power delivered by the panel does not meet the desired requirements. A further limitation is the solar irradiance, i.e., the intensity of sunlight per unit area of the panel. The performance of a solar tracker is more effective if the maximum intensity of sunlight is captured, and Dual Axis Solar Tracking Systems (DASTS) are proposed to overcome this shortcoming. A Dual Axis Tracker (DAT) gives greater efficiency compared to a Single Axis Tracker (SAT): the tracker dynamically follows the sun and changes its position accordingly to extract the maximum possible output power. The solar panel can be utilized to its maximum potential with a charge controller known as a Maximum Power Point Tracker (MPPT). The proposed system focuses on several Maximum Power Point (MPP) control procedures to extract the maximum power from the solar array, and compares the P&O algorithm with the IC algorithm. The comparison between the dual axis solar tracker with and without MPPT algorithms is carried out using simulation and hardware.
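To illustrate one of the MPPT strategies compared in the paper, here is a minimal perturb-and-observe (P&O) loop run against a made-up PV power curve; the panel model, step size, and iteration count are illustrative assumptions, not the authors' hardware configuration.

```python
# Perturb-and-observe MPPT on a toy PV curve with its maximum power point at ~17.5 V.
def pv_power(v):
    return max(0.0, 60.0 - 0.8 * (v - 17.5) ** 2)   # peak of ~60 W at 17.5 V

v, step, direction = 12.0, 0.2, +1
p_prev = pv_power(v)
for _ in range(100):
    v += direction * step          # perturb the operating voltage
    p = pv_power(v)
    if p < p_prev:                 # observe: power dropped, so reverse direction
        direction = -direction
    p_prev = p
print(f"settled near V = {v:.2f} V, P = {p_prev:.2f} W")   # oscillates around the MPP
```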

Paruchuri Chandra Babu Naidu, R. Jayashree, S. Rajasekaran, J. N. Swaminathan
Deciphering Fitness Application Data Using Machine Learning

This paper predicts fitness application data using two machine learning techniques, linear regression and decision trees. A fitness tracker collects data pertaining to physical activities such as steps, distance, calories burnt, sleep routine, etc. This paper explores the correlation between these physical activities to find out which of them affects calories burnt the most. A comparison is made between the two popular machine learning algorithms to depict their performance, interpretability, scalability, and applicability to different datasets. This allows us to maximize efficiency by reducing the collection of unnecessary data, and we further discuss suitable machine learning algorithms to implement in fitness devices for better accuracy in readings from fitness applications.

Sagar Puniyani, Dhruv Girotra, Divya Agarwal, Deepali Virmani
Comparison of Perplexity Scores of Language Models for Telugu Data Corpus in the Agricultural Domain

The agricultural domain has a lack of readily available resources, especially in regional languages. For a country like India, which relies on agriculture as a major source of GDP and employment, it becomes vital to develop a corpus that spans multiple topics of this domain. After data collection, a language model (LM) must be implemented to assess the usefulness of the collected data. Perplexity is a measure of how well a probability distribution model can predict a sample; the LM with the lowest perplexity score performs the best. This paper compares three different LMs: n-gram, LSTM, and Transformers. The perplexity of the LSTM and Transformer models was found to be 23.127 and 12.3, respectively, on the Telugu language dataset that was built by collecting data via web-scraping of links on the internet. The alignment of the theoretical knowledge and the observed perplexity scores validates that the prepared Telugu agricultural dataset can be used for further NLP applications.
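For reference, the sketch below shows how perplexity is typically computed as the exponential of the average negative log-likelihood over the test tokens; the token probabilities here are made up, whereas the paper derives them from the n-gram, LSTM, and Transformer models trained on the Telugu corpus.

```python
# Perplexity = exp(mean negative log-likelihood) over a (tiny, invented) test sample.
import math

token_probs = [0.12, 0.05, 0.30, 0.08, 0.22]      # p(w_i | w_<i) for each test token
nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
print("perplexity:", round(math.exp(nll), 3))      # lower is better
```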

Pooja Rajesh, Akshita Gupta, Praneeta Immadisetty
New Post-quantum Crypto-algorithm Utilizing Hash Function: Applicable in Blockchain

Hash-based cryptosystems are among the most active research areas in cryptography. They provide strong security and authenticity because hash functions are irreversible (one-way), so no one can recover the key of a hash-based cryptosystem. Motivation: Quantum computers, with their unprecedented processing power, have the potential to solve complex mathematical problems at speeds that could render traditional cryptography systems obsolete. This looming threat has spurred the development of post-quantum cryptography, aimed at creating algorithms resistant to attacks from quantum computers. Methodology: Hash functions have long been a fundamental building block in cryptographic systems. They transform input data into a fixed-size string of characters, making them integral to various security protocols. In the context of post-quantum cryptography, researchers are exploring new ways to harness the power of hash functions to create robust encryption schemes. In this article, we discuss the research progress of hash-based cryptosystems and propose a new hash-based crypto-algorithm that can be considered a post-quantum candidate in today's cryptography world.
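For background on the kind of hash-based construction this line of work builds on, the sketch below implements a classic Lamport one-time signature with SHA-256; it is a textbook scheme shown for illustration, not the new algorithm proposed in the chapter.

```python
# Lamport one-time signature: per message bit, reveal one of two pre-committed secrets.
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()

# Key generation: two random 32-byte secrets per bit; the public key holds their hashes.
sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
pk = [(H(a), H(b)) for a, b in sk]

def bits_of(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes):
    return [sk[i][bit] for i, bit in enumerate(bits_of(msg))]

def verify(msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits_of(msg)))

sig = sign(b"blockchain transaction")
print(verify(b"blockchain transaction", sig), verify(b"tampered", sig))  # True False
```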

Namita Tiwari, Shashank Dwivedi, Sonika Singh
Non-invasive Glucometry: A New Frontier in Blood Sugar Monitoring

Diabetes is characterized by insulin resistance, which can cause serious issues if undetected and mistreated. Regular monitoring of blood glucose levels is necessary to prevent complications. Conventional diabetes checkups rely on invasive blood punctures that can cause calluses and spread infections if not handled meticulously. To address issues faced by diabetic patients, such as painful insertions and the associated discomfort, much work has recently been done in the field of non-invasive glucometers. The purpose of this research is to develop a near-infrared (NIR, at 950 nm) blood glucose monitoring system that is non-invasive and affordable. The device will employ an in-vitro glucose measurement prototype that detects diffuse reflectance spectra of blood from human fingers and forearms to ensure sensitivity to fluctuating glucose concentrations at a very low cost.

Himanshu Lohia, Rishi Singhal, Shlok Bhardwaj, Divya Agarwal, Deepali Virmani
Smart Irrigation and Monitoring Framework

Agriculture plays a crucial role in the development of an agricultural country; in every such country, it is known as the backbone of the economy. Studies have found that almost 70% of the country's people depend upon farming. The shortage of water resources is a serious problem not only for the current generation but for upcoming generations too. Smart agriculture is a solution to this issue, as it includes the modernization of traditional methods used in agriculture. The paper highlights smart and automatic ways to control the supply of water to fields according to the particular yield. A sensor with supporting circuitry is used to sense soil moisture, and the sensed data is pushed to the user's cell phone via SMS through a GSM module.

Rakesh Kr. Arora, Manoj Kr. Gupta, Suraj Pal Singh
Blockchain-Enabled Traceability Frameworks and Prospects for Dairy Supply Chain Management

Traditional food supply chains are, by their very nature, centralized and raise a number of concerns, including a single point of failure, problems with product consistency, compromises in quality, and potential data loss. Because of the expanding consumer base and the complexity of the dairy supply chain, consumers have limited knowledge about the products manufactured or handled by producers and intermediaries. This information imbalance within the dairy industry raises significant challenges pertaining to welfare, ecological sustainability, and human health, underscoring the critical need for an upgraded, decentralized supply chain model. Given this context, a robust dairy supply network is required to satisfy consumers' information needs and grow their confidence in the dairy products they consume. This article shows how blockchain technology can be applied in the dairy industry. It presents a study of existing blockchain-based frameworks for the dairy supply network and explores the technologies integrated alongside blockchain in these frameworks, as well as the functionalities the frameworks provide.

Deepti Sharma, Gurpreet Singh, Amanpreet Kaur
A Systematic Analysis of Diverse Large Language Models and Their Operational Paradigm

The use of text generated by Large Language Models (LLMs), such as ChatGPT, has been growing rapidly, since many language-related problems, including language-translation queries, can now be solved easily. Well-known LLMs include OpenAI's GPT models (GPT-1, 2, 3.5, 4), Google's Bard and BERT, and Facebook's RoBERTa. Because such models generate natural-language text, several issues arise; for example, human creativity may fade if ideas, code, and solutions are all produced by these models. An accurate and efficient classifier tool is therefore needed to detect generated text. Before developing such a classifier, this paper reviews various LLMs so that their actual working can be understood and used in the further analysis of a classifier model. LLM research has recently made tremendous strides in both academia and industry, with the introduction of ChatGPT, a powerful AI chatbot built on LLMs, being a noteworthy milestone that received a great deal of public interest. The technical development of LLMs has had a significant impact on the AI community and has fundamentally altered how we create and use AI systems. Given this rapid advancement, this survey evaluates current developments in LLMs by explaining the background, major findings, and mainstream techniques. We focus on the four core aspects of LLMs: pre-training, adaptation tuning, utilization, and capacity evaluation. We also discuss the challenges that remain to be overcome for future progress, as well as the resources available for developing LLMs. For both academics and engineers, this study provides a current review of the LLM literature.

Omkar Bhattarai, Raj Chaudhary, Rahul Kumar, Ali Imam Abidi
Distributed Resource Allocation for V2V Communications in D2D Networks Using MapReduce-Based Ant Colony Optimization

Device-to-Device (D2D) communication based on Vehicle-to-Vehicle (V2V) technology has become a promising paradigm for enhancing the effectiveness and quality of wireless network connections. However, resource allocation for V2V communication in D2D networks is a difficult problem because of the dynamic nature of the network, resource limitations, and the need to use network resources efficiently. The problem addressed here is the distribution of resources for V2V communications while meeting quality-of-service (QoS) requirements, including data rate, sum time, and average interference. This paper describes a distributed resource allocation approach for V2V communications in D2D networks based on MapReduce and Ant Colony Optimization (ACO). The proposed method takes a number of network factors into account, including interference, sum time, and data rate, in order to improve resource allocation. ACO is a swarm-intelligence metaheuristic that is effective for optimizing the network aggregation rate while maintaining QoS requirements. In comparison with alternative methods, the simulation results demonstrated a notable increase in resource allocation performance: data transfer rates improved by 5%, reaching 77,832 bits for 30 D2D pairs within a sum time of 1 ms, and the average interference decreased to −63.5. Lower interference increases the probability of successful resource allocation.
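
The paper's MapReduce-distributed formulation is not reproduced here. The following single-node sketch only illustrates the core ACO loop, assigning channels to V2V pairs under an invented pairwise-interference cost; the numbers of ants, iterations, channels, and the evaporation rate are all assumed values.

```python
import random

random.seed(0)
n_pairs, n_channels = 6, 3

# Invented symmetric interference cost incurred when two V2V pairs share a channel.
interference = [[0.0] * n_pairs for _ in range(n_pairs)]
for i in range(n_pairs):
    for j in range(i + 1, n_pairs):
        interference[i][j] = interference[j][i] = random.uniform(0.0, 1.0)

def total_interference(assignment):
    return sum(interference[i][j]
               for i in range(n_pairs) for j in range(i + 1, n_pairs)
               if assignment[i] == assignment[j])

pheromone = [[1.0] * n_channels for _ in range(n_pairs)]
best_assignment, best_cost = None, float("inf")

for _ in range(50):                       # iterations (assumed)
    ants = []
    for _ in range(10):                   # ants per iteration (assumed)
        assignment = [random.choices(range(n_channels), weights=pheromone[i])[0]
                      for i in range(n_pairs)]
        ants.append((total_interference(assignment), assignment))
    iteration_cost, iteration_best = min(ants)
    if iteration_cost < best_cost:
        best_cost, best_assignment = iteration_cost, iteration_best
    # Evaporate pheromone, then reinforce the iteration-best assignment.
    for i in range(n_pairs):
        pheromone[i] = [0.9 * tau for tau in pheromone[i]]
        pheromone[i][iteration_best[i]] += 1.0 / (1.0 + iteration_cost)

print("Best channel assignment:", best_assignment)
print("Interference cost:", round(best_cost, 3))
```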

Ali Ayad, Fahad Ghalib Abdulkadhim
Development of Decentralized Project Funding Application Using Polygon Blockchain

The advent of blockchain technology has opened up new possibilities for decentralized finance (DeFi) applications, revolutionizing traditional methods of project funding. This paper explores the development of a decentralized project funding application on the Polygon blockchain, aiming to address the challenges and limitations of existing project funding models. By incorporating Polygon-specific features, the application provides a secure, open, and effective way of financing projects on a low-cost, high-performance blockchain. Through smart contracts and decentralized governance mechanisms, it allows project creators to engage with sponsors directly, without intermediaries, and thereby seeks to build trust in the ecosystem. The study mainly analyses the technical aspects of constructing such an application, including the choice of programming languages, smart-contract development, and integration with the Polygon network. It also examines what organizations involved in project funding would gain from decentralized methods, such as easier access, lower transaction costs, and better allocation of investment. The findings contribute to the growing body of literature on blockchain applications and provide valuable insights for future research and implementation of decentralized project-financing solutions.
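
The paper's contract and client code are not shown here. Purely as a sketch of how a client might interact with a crowdfunding contract on Polygon, the example below uses web3.py with a placeholder contract address, a hypothetical contribute(projectId) function, and an example public RPC endpoint; none of these details come from the paper.

```python
from web3 import Web3

# Public Polygon RPC endpoint (example value, not taken from the paper).
w3 = Web3(Web3.HTTPProvider("https://polygon-rpc.com"))

# Placeholder address and minimal ABI for a hypothetical crowdfunding contract.
FUNDING_ADDRESS = "0x0000000000000000000000000000000000000000"
FUNDING_ABI = [{
    "name": "contribute", "type": "function", "stateMutability": "payable",
    "inputs": [{"name": "projectId", "type": "uint256"}], "outputs": [],
}]
contract = w3.eth.contract(address=FUNDING_ADDRESS, abi=FUNDING_ABI)

def contribute(project_id: int, amount_matic: float, sender: str, private_key: str):
    """Build and sign a contribution transaction for the given project."""
    tx = contract.functions.contribute(project_id).build_transaction({
        "from": sender,
        "value": w3.to_wei(amount_matic, "ether"),
        "nonce": w3.eth.get_transaction_count(sender),
        "gasPrice": w3.eth.gas_price,
    })
    signed = w3.eth.account.sign_transaction(tx, private_key)
    # Broadcast with w3.eth.send_raw_transaction(signed.raw_transaction)
    # (the attribute is named rawTransaction in web3.py v6).
    return signed
```

In a real deployment the corresponding Solidity contract would enforce funding goals, deadlines, and refunds; only the client-side call is sketched here.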

R. Nithin Rao, M. S. Supriya, R. Rohit Kumar, Mohammed Kaif, T. B. Samarth, M. Naveen Kumar
Analysis of Random Forest Compared to Ridge Linear Classification for Handwritten Alphabet Recognition

Aim: To analyze handwritten alphabet recognition using Ridge Linear classification compared with Random Forest. Materials and Methods: The chosen dataset is split 80/20 for the analysis: 80% of the data is used for training and 20% for testing. Two models, a Ridge Linear classifier and a Random Forest, were developed and experimentally evaluated. The statistical analysis was carried out in SPSS with a G power of 0.95. Result: The mean accuracy of the recommended Ridge Linear classification method is 93%, higher than the 87.42% of the conventional method. The results show no statistically significant difference between the novel Ridge Linear classifier and Random Forest, with p = 0.144 (independent-samples test, significance level 0.05). Conclusion: The proposed algorithm performed better than the conventional method for handwritten alphabet recognition.
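
The paper's alphabet dataset is not available here, so as a minimal sketch of the 80/20 comparison protocol, the example below trains both classifiers on scikit-learn's built-in handwritten digits dataset, which merely stands in for the alphabet data; the reported accuracies will differ from the paper's.

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import RidgeClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Built-in handwritten digits stand in for the paper's alphabet dataset.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.20, random_state=42)   # the 80/20 split described above

for name, model in [("Ridge Linear", RidgeClassifier()),
                    ("Random Forest", RandomForestClassifier(random_state=42))]:
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: test accuracy = {accuracy:.3f}")
```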

J. Prakash, P. Dass
Backmatter
Metadata
Title
Innovative Computing and Communications
Edited by
Aboul Ella Hassanien
Sameer Anand
Ajay Jaiswal
Prabhat Kumar
Copyright Year
2025
Publisher
Springer Nature Singapore
Electronic ISBN
978-981-9741-52-6
Print ISBN
978-981-9741-51-9
DOI
https://doi.org/10.1007/978-981-97-4152-6