
2024 | Book

Internet of Things. Advances in Information and Communication Technology

6th IFIP International Cross-Domain Conference, IFIPIoT 2023, Denton, TX, USA, November 2–3, 2023, Proceedings, Part I


About this book

This book constitutes the refereed post-conference proceedings of the 6th IFIP International Cross-Domain Conference on Internet of Things, IFIPIoT 2023, held in Denton, TX, USA, in November 2023.

The 36 full papers and 27 short papers presented were carefully reviewed and selected from 84 submissions. The papers offer insights into the latest innovations, challenges, and opportunities in IoT, covering a wide array of topics, including IoT architectures, security and privacy, data analytics, edge computing, and applications in various domains.

Table of Contents

Frontmatter

Artificial Intelligence and Machine Learning Technologies for IoT (AMT)

Frontmatter
An Optimized Graph Neural Network-Based Approach for Intrusion Detection in Smart Vehicles

Due to recent developments in vehicle networks (VNs), more and more people expect their electric cars to offer sophisticated networking features and perform a variety of advanced activities. Several technologies are installed in these autonomous vehicles to aid the drivers, and cars have unquestionably become smarter as ever more devices and applications are installed on them. As a result, the security of driver-assistance and self-driving systems becomes a life-threatening concern, since hostile attacks that cause traffic accidents may infiltrate smart cars. This research provides a technique based on a Graph Neural Network (GNN) deep learning (DL) model for detecting intrusions in VNs; the model recognizes attack patterns and classifies threats accordingly. Experiments use car-hacking datasets covering DoS attacks, fuzzy attacks, drive-gear spoofing, and RPM gauge spoofing. The car-hacking dataset is converted into image files, and the proposed GNN model operates on the resulting image dataset. A comparison with other DL models indicates that the GNN model is superior: in the test scenarios, it identifies anomalies with higher detection F1-score, recall, precision, and accuracy, ensuring accurate detection while limiting false alarms.

Pallavi Zambare, Ying Liu
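
As a rough illustration of graph-based classification (not the authors' image-based pipeline), the following NumPy sketch runs two graph-convolution layers over a toy message-flow graph whose nodes stand in for CAN IDs; the adjacency matrix, feature dimensions, and weights are all synthetic placeholders.

import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_feats, n_classes = 5, 8, 4   # e.g., CAN IDs as nodes, 4 attack classes
A = rng.integers(0, 2, size=(n_nodes, n_nodes))        # message-flow adjacency
A = np.maximum(A, A.T)                                  # make it undirected
A_hat = A + np.eye(n_nodes)                             # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))  # degree normalization
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

X = rng.standard_normal((n_nodes, n_feats))             # per-node traffic features
W1 = rng.standard_normal((n_feats, 16)) * 0.1
W2 = rng.standard_normal((16, n_classes)) * 0.1

H = np.maximum(A_norm @ X @ W1, 0.0)    # GCN layer + ReLU: aggregate neighbors
logits = A_norm @ H @ W2                # second layer produces class scores
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(probs.round(3))                   # per-node attack-class probabilities
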
Evaluation of the Energy Viability of Smart IoT Sensors Using TinyML for Computer Vision Applications: A Case Study

TinyML technology, situated at the intersection of Machine Learning, Embedded Systems, and the Internet of Things (IoT), presents a promising solution for a wide range of IoT domains. However, achieving successful deployment of this technology on embedded devices necessitates optimizing energy efficiency. To validate the feasibility of TinyML on embedded devices, extensive field research and real-world experiments were conducted. Specifically, a TinyML computer vision model for people detection was implemented on an embedded system installed in a turnstile at a Federal Institute. The device accurately counts people, monitors battery levels, and transmits real-time data to the cloud. Encouraging results were obtained from the prototype, and experiments were performed using a lithium battery configuration with three batteries in series. Hourly voltage consumption analysis was conducted, and the findings were illustrated through graphical representations. The camera sensor prototype exhibited a consumption rate of 1.25 V per hour, whereas the prototype without the camera sensor displayed a more sustainable consumption rate of 0.93 V per hour. This field research contributes to advancing TinyML applications and enriching studies regarding its integration with IoT and computer vision.

Adriel Monti De Nardi, Maxwell Eduardo Monteiro
Simulated Annealing Based Area Optimization of Multilayer Perceptron Hardware for IoT Edge Devices

The deployment of highly parameterized Neural Network (NN) models on resource-constrained hardware platforms such as IoT edge devices is a challenging task due to their large size, expensive computational costs, and high memory requirements. To address this, we propose a Simulated Annealing (SA) algorithm-based NN optimization approach to generate area-optimized hardware for multilayer perceptrons on IoT edge devices. Our SA loop aims to change hidden layer weights to integer values and uses a two-step process to round new weights that are close to integers, reducing hardware through operation strength reduction and making the approach well suited to IoT devices. Throughout the optimization process, we prioritize SA moves that do not compromise the model's efficiency, ensuring optimal performance in a resource-constrained environment. We validate our proposed methodology on five MLP benchmarks implemented on FPGA, and we observe that the best-case savings are obtained when the amount of perturbation (p) is 10% and the number of perturbations at each temperature (N) is 10,000, keeping the constant temperature reduction factor (α) at 0.95. For the best-case solution, the average savings in Lookup Tables (LUTs) and flip-flops (FFs) are 24.83% and 25.76%, respectively, with an average model accuracy degradation of 1.64%. Our proposed SA-based NN optimization method can significantly improve the deployment of area-efficient NN models on resource-constrained IoT edge devices without compromising model accuracy, making it a promising approach for various IoT applications.

Rajeev Joshi, Lakshmi Kavya Kalyanam, Srinivas Katkoori
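
A minimal NumPy sketch of the annealing loop described above, under toy assumptions: a linear surrogate model and loss stand in for the MLP, weights near integers are snapped to integers, and moves are accepted by the Metropolis rule with the paper's parameters p = 10% and α = 0.95 (N is scaled down here for speed).

import numpy as np

rng = np.random.default_rng(1)

def loss(w, X, y):
    # toy surrogate for model error on a validation set
    return np.mean((X @ w - y) ** 2)

X = rng.standard_normal((64, 10)); y = rng.standard_normal(64)
w = rng.standard_normal(10) * 3.0          # "hidden layer" weights to optimize

T, alpha, N, p = 1.0, 0.95, 200, 0.10      # temperature schedule and move size
best, best_loss = w.copy(), loss(w, X, y)
while T > 1e-3:
    for _ in range(N):
        cand = w.copy()
        i = rng.integers(len(w))
        cand[i] += rng.uniform(-p, p) * abs(cand[i])   # perturb one weight by up to p
        near = np.isclose(cand, np.round(cand), atol=0.05)
        cand[near] = np.round(cand[near])              # snap near-integer weights
        d = loss(cand, X, y) - loss(w, X, y)
        if d < 0 or rng.random() < np.exp(-d / T):     # Metropolis acceptance
            w = cand
            if loss(w, X, y) < best_loss:
                best, best_loss = w.copy(), loss(w, X, y)
    T *= alpha
print(best.round(2), round(best_loss, 4))
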
Machine Learning-Based Multi-stratum Channel Coordinator for Resilient Internet of Space Things

Sensing and transferring data are critical and challenging tasks for space missions, especially in the presence of extreme environments. Unlike terrestrial environments, space poses unprecedented reliability challenges to wireless communication channels due to electromagnetic interference and radiation. The determination of a dependable channel for exchanging critical data in a highly intemperate environment is crucial for the success of space missions. This paper proposes a unique Machine Learning (ML)-based multi-stratum channel coordinator in building the Resilient Internet of Space Things (ResIST). ResIST channel coordinator accommodates a lightweight software-defined wireless communication topology that allows dynamic selection of the most trustworthy channel(s) from a set of disparate frequency channels by utilizing ML technologies. We build a tool that simulates the space communication channel environments and then evaluate several prediction models to predict the bandwidths across a set of channels that experience the influence of radiation and interference. The experimental results show that ML-prediction technologies can be used efficiently for the determination of reliable channel(s) in extreme environments. Our observations from the heatmap and error analysis on the various ML-based methods show that Feed-Forward Neural Network (FFNN) drastically outperforms other ML methods as well as the simple prediction baseline method.

Md Tajul Islam, Sejun Song, Baek-Young Choi
A Schedule of Duties in the Cloud Space Using a Modified Salp Swarm Algorithm

Cloud computing is a concept introduced in the information technology era, with its main precursors being grid, distributed, and utility computing. The cloud is being developed continuously and, naturally, faces many challenges, one of which is scheduling. A schedule or timeline is a mechanism used to optimize the time for performing a duty or set of duties, and a scheduling process is accountable for choosing the best resources for performing a duty. The main goal of a scheduling algorithm is to improve the efficiency and quality of the service while at the same time ensuring the acceptability and effectiveness of the targets. The task scheduling problem is one of the most important NP-hard issues in the cloud domain and, so far, many techniques have been proposed as solutions, including genetic algorithms (GAs), particle swarm optimization (PSO), and ant colony optimization (ACO). To address this problem, this paper expands, improves, and applies one of the collective intelligence algorithms, the Salp Swarm Algorithm (SSA). The performance of the proposed algorithm has been compared with that of GAs, PSO, continuous ACO, and the basic SSA. The results show that our algorithm generally outperforms the others; for example, compared to the basic SSA, the proposed method reduces makespan by approximately 21% on average.

Hossein Jamali, Ponkoj Chandra Shill, David Feil-Seifer, Frederick C. Harris Jr., Sergiu M. Dascalu
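
For concreteness, here is a basic (unmodified) Salp Swarm Algorithm applied to a toy task-to-VM scheduling problem with makespan as the objective; the task lengths, VM speeds, and the continuous-to-discrete mapping are illustrative assumptions, not the paper's improved variant.

import numpy as np

rng = np.random.default_rng(2)
n_tasks, n_vms, n_salps, iters = 20, 4, 30, 200
task_len = rng.uniform(10, 100, n_tasks)      # task sizes (instruction counts)
vm_speed = rng.uniform(1, 4, n_vms)           # VM speeds (instructions/second)

def makespan(pos):
    assign = np.clip(pos, 0, n_vms - 1e-9).astype(int)   # continuous -> VM index
    finish = np.zeros(n_vms)
    for t, v in enumerate(assign):
        finish[v] += task_len[t] / vm_speed[v]
    return finish.max()

lb, ub = 0.0, float(n_vms)
S = rng.uniform(lb, ub, (n_salps, n_tasks))   # salp positions = candidate schedules
F = S[np.argmin([makespan(s) for s in S])].copy()   # food source = best schedule
for it in range(iters):
    c1 = 2 * np.exp(-(4 * (it + 1) / iters) ** 2)     # exploration coefficient
    for j in range(n_salps):
        if j == 0:  # leader moves around the food source
            c2, c3 = rng.random(n_tasks), rng.random(n_tasks)
            step = c1 * ((ub - lb) * c2 + lb)
            S[j] = np.where(c3 < 0.5, F + step, F - step)
        else:       # followers move toward the preceding salp
            S[j] = (S[j] + S[j - 1]) / 2
        S[j] = np.clip(S[j], lb, ub)
        if makespan(S[j]) < makespan(F):
            F = S[j].copy()
print("best makespan:", round(makespan(F), 2))
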
Layer-Wise Filter Thresholding Based CNN Pruning for Efficient IoT Edge Implementations

This paper presents a novel approach for efficiently running convolutional neural networks (CNNs) on Internet of Things (IoT) edge devices. The proposed method utilizes threshold-based pruning to optimize pre-trained CNN models, enabling inference on resource-constrained IoT and edge devices. The pruning thresholds for each layer are iteratively adjusted using a range-based threshold pruning technique. The pre-trained network evaluates the accuracy of the pruned model and dynamically adjusts the pruning thresholds to maximize accuracy. The effectiveness of the proposed approach is validated on the widely-used LeNet benchmark network, with MNIST, Fashion-MNIST, and SVHN datasets. Our experimental results show that for the MNIST dataset, we can prune 62–64% of weights for an accuracy loss of 1–4%. Similarly, for Fashion-MNIST, we can prune around 64% for an accuracy loss of around 2.92%, and for the SVHN dataset, we can prune around 55% of weights for an accuracy loss of 1.7% on average, saving resources.

Lakshmi Kavya Kalyanam, Rajeev Joshi, Srinivas Katkoori
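
The iterative, range-based threshold search can be sketched as a per-layer bisection over pruning thresholds subject to an accuracy budget; in this toy NumPy version the layer shapes and the accuracy function are placeholders (a real run would evaluate the pruned LeNet on a validation set).

import numpy as np

rng = np.random.default_rng(3)
layers = {"conv1": rng.standard_normal((6, 25)),   # stand-ins for LeNet weights
          "conv2": rng.standard_normal((16, 150)),
          "fc1":   rng.standard_normal((120, 400))}

def accuracy(weights):          # placeholder for evaluating the pruned model
    kept = sum(np.count_nonzero(w) for w in weights.values())
    total = sum(w.size for w in weights.values())
    return 0.99 - 0.2 * (1 - kept / total)   # toy proxy: accuracy falls as pruning grows

baseline, budget = accuracy(layers), 0.02    # tolerate 2% accuracy loss
thresholds = {name: 0.0 for name in layers}
for name, w in layers.items():
    lo, hi = 0.0, np.abs(w).max()            # range-based threshold search
    for _ in range(20):                      # bisect the largest safe threshold
        mid = (lo + hi) / 2
        trial = {n: np.where(np.abs(v) < (mid if n == name else thresholds[n]), 0, v)
                 for n, v in layers.items()}
        if baseline - accuracy(trial) <= budget:
            lo = mid
        else:
            hi = mid
    thresholds[name] = lo
    layers[name] = np.where(np.abs(w) < lo, 0, w)   # commit the pruning

pruned = 1 - sum(np.count_nonzero(w) for w in layers.values()) / sum(
    w.size for w in layers.values())
print(f"pruned {pruned:.0%} of weights under a {budget:.0%} accuracy budget")
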

Energy-Aware Security for IoT (EAS)

Frontmatter
Shrew Distributed Denial-of-Service (DDoS) Attack in IoT Applications: A Survey

With the rise of IoT and cloud computing, DDoS attacks have become increasingly harmful. This paper presents a survey of techniques for detecting and preventing DDoS attacks, focusing on Shrew (low-rate) DDoS attacks. We explore the use of machine learning for DDoS detection and prevention and, as a future direction, discuss a new technique that simplifies the detection and prevention of shrew DDoS attacks originating from multiple infected machines, commonly known as zombie machines or, collectively, botnets. The insights presented in this paper will be valuable for researchers and practitioners in cybersecurity.

Harshdeep Singh, Vishnu Vardhan Baligodugula, Fathi Amsaad
Honeypot Detection and Classification Using Xgboost Algorithm for Hyper Tuning System Performance

The purpose of this research paper is to detect and classify hidden honeypots in Ethereum smart contracts; the novelty of the work lies in the hyperparameter tuning that accompanies the classification. Blockchain technologies are growing rapidly, and attackers are adopting a far more proactive strategy: they attempt to dupe victims by deploying seemingly vulnerable contracts that contain hidden traps. Such a seemingly vulnerable contract is called a honeypot, and this work aims to detect such deployed honeypots. Honeybadger, a tool that uses symbolic execution to detect honeypots by analyzing contract bytecode, has previously been presented. In this system, we consider features such as fund movement between the contractor and the contract, transactions between sender and participant, and contract properties such as source-code length and compilation information. The features are then trained and classified using machine learning algorithms (XGBoost and gradient boosting with hyperparameter tuning) into Balance Disorder, Hidden State Update, Hidden Transfer, Inheritance Disorder, Skip Empty String Literal, Straw Man Contract, Type Deduction Overflow, and Uninitialized Struct. Through this approach, we developed a machine-learning model that detects and classifies hidden honeypots in Ethereum smart contracts; the hyperparameter tuning is the unique addition that separates this work from earlier studies in the area.

Vinayak Musale, Pranav Mandke, Debajyoti Mukhopadhyay, Swapnoneel Roy, Aniket Singh
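
A minimal sketch of the classification-with-hyperparameter-tuning step, using scikit-learn's GradientBoostingClassifier and GridSearchCV on synthetic stand-in features; xgboost's XGBClassifier drops into the same grid-search call, and the feature set here is hypothetical.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# stand-ins for contract features: fund flows, source length, compiler info, ...
X, y = make_classification(n_samples=400, n_features=12, n_classes=4,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

grid = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [100, 200],
                "max_depth": [2, 3],
                "learning_rate": [0.05, 0.1]},
    cv=3)
grid.fit(X_tr, y_tr)                      # exhaustive search over the grid
print(grid.best_params_, grid.score(X_te, y_te))
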
Electromagnetic Fault Injection Attack on ASCON Using ChipShouter

Electromagnetic fault injection (EMFI) is a deliberate technique used to induce faults in a device by exposing it to electromagnetic interference. ASCON is a lightweight cipher that offers better performance than other ciphers, making it suitable for IoT devices with limited resources. However, the use of lightweight ciphers on hardware devices can pose a significant security risk against EMFI attacks, which can manipulate both the device's behavior and the implemented encryption algorithms. Our research used the ChipShouter, a specialized tool designed specifically for EMFI attacks on electronic devices. During these attacks, we intentionally exposed the M5STACK ESP32 Timer Camera (OV3660) module, on which we implemented the ASCON algorithm, to electromagnetic pulses emitted by the ChipShouter. These pulses were directed at the PSRAM of the target device, where essential values such as the plaintext, associated data, nonce, and key are stored. Through the introduction of these pulses, we successfully injected faults and demonstrated the vulnerability of ASCON to EMFI attacks. To evaluate the impact, we tested different sizes for the input plaintext, namely 250 Kb, 500 Kb, and 1 MB. The results revealed the following fault injection percentages: 24% for the 250 Kb size, 54% for the 500 Kb size, and 90% for the 1 MB size.

Varun Narayanan, Sriram Sankaran

Edge AI for Smart Wearables (EAW)

Frontmatter
A Smart Health Application for Real-Time Cardiac Disease Detection and Diagnosis Using Machine Learning on ECG Data

Cardiac disease, also referred to as cardiovascular disease, is a collection of conditions that affect the heart and blood vessels. Medical professionals typically use a combination of medical history, physical examination, and various diagnostic tests, such as electrocardiograms (ECG/EKG), echocardiograms, and stress tests, to diagnose cardiac diseases. In response to this issue, we introduce a mobile application that continuously monitors electrocardiogram signals and displays both average and instantaneous heart rates. The aim of this project is to detect and diagnose cardiac diseases so that patients can become informed about their heart health and take appropriate actions based on the results obtained. To identify diseases from real-time ECG data, we used machine learning (ML) classifiers and compared them with offline data to validate the classification. The model used in our application is pre-trained on the MIT-BIH Arrhythmia Database, which contains a wide range of heart conditions. We used an Artificial Neural Network (ANN) as the pre-trained model for multiclass detection, as it performed best among the ML models, showing an overall accuracy of 94%. The performance of the model is evaluated by testing it on the application using the MIT-BIH test dataset, on which the beat-detecting pre-trained model showed an overall accuracy of 91.178%. The results indicate that the Smart-Health application can accurately classify heart diseases, providing an effective tool for early detection and monitoring of cardiac conditions.

Ucchwas Talukder Utsha, I Hua Tsai, Bashir I. Morshed
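
As a schematic of the beat-classification stage, the following scikit-learn snippet trains a small feed-forward ANN on synthetic stand-ins for windowed ECG beats; the window length, class count, and layer sizes are assumptions, not the authors' pre-trained model.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# stand-in for windowed ECG beats: 180 samples per beat, 5 beat classes
X, y = make_classification(n_samples=1000, n_features=180, n_classes=5,
                           n_informative=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
ann.fit(X_tr, y_tr)                       # train the beat classifier
print(f"beat-class accuracy: {ann.score(X_te, y_te):.3f}")
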
Reinforcement Learning Based Angle-of-Arrival Detection for Millimeter-Wave Software-Defined Radio Systems

Millimeter-wave (mmWave) signals experience severe environmental path loss. To mitigate the path loss, beam-forming methods are used to realize directional mmWave beams that can travel longer distances. Yet, advanced algorithms are needed to track these directional beams by detecting the angle-of-arrival (AoA) and aligning the transmit and receive antennas. To realize these advanced beam-forming algorithms in real-world scenarios, Software-Defined Radio (SDR) platforms that allow both high-level programming capability and mmWave beam-forming are needed. Using a low-cost mmWave SDR platform, we design and prototype two reinforcement learning (RL) algorithms for AoA detection, i.e., Q-learning and Double Q-learning. We evaluate these algorithms and study the trade-offs involved in their design.

Marc Jean, Murat Yuksel
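
A toy Q-learning loop for AoA detection under assumed dynamics: states are discretized beam angles, actions steer the beam left/stay/right, and the reward is a simulated received signal strength that peaks at the (hidden) true angle; none of this reflects the authors' SDR implementation.

import numpy as np

rng = np.random.default_rng(4)
angles = np.arange(0, 181, 10)          # discretized receive-beam angles
true_aoa = 120                          # hidden angle-of-arrival to find

def rss(angle):                         # simulated received signal strength
    return -abs(angle - true_aoa) / 10 + rng.normal(0, 0.1)

Q = np.zeros((len(angles), 3))          # actions: 0=left, 1=stay, 2=right
alpha, gamma, eps = 0.3, 0.9, 0.2
s = 0
for step in range(2000):
    a = rng.integers(3) if rng.random() < eps else int(Q[s].argmax())
    s2 = int(np.clip(s + (a - 1), 0, len(angles) - 1))      # steer the beam
    r = rss(angles[s2])                                     # reward = measured RSS
    Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])  # Q-learning update
    s = s2
print("estimated AoA:", angles[Q.max(axis=1).argmax()], "degrees")
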
Empowering Resource-Constrained IoT Edge Devices: A Hybrid Approach for Edge Data Analysis

The efficient development of accurate machine learning (ML) models for Internet of Things (IoT) edge devices is crucial for enabling intelligent decision-making at the edge of the network. However, the limited computational resources of IoT edge devices, such as low processing power and constrained memory, pose significant challenges in implementing complex ML algorithms directly on these devices. This paper addresses these challenges by proposing a hybrid ML model that combines Principal Component Analysis (PCA), Decision Tree (DT), and Support Vector Machine (SVM) classifiers. By utilizing hardware-friendly techniques such as dimensionality reduction, optimized hyperparameters, and the combination of accurate and interpretable classifiers, the proposed hybrid model addresses the limitations of IoT edge devices. The proposed hybrid model enables intelligent decision-making at the edge while minimizing computational and energy costs. Experimental evaluations demonstrate the improved performance and resource utilization of the proposed model, providing insights into its effectiveness for IoT edge applications.

Rajeev Joshi, Raaga Sai Somesula, Srinivas Katkoori
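
A compact scikit-learn sketch of the hybrid idea: PCA for dimensionality reduction feeding a Decision Tree and an SVM combined by soft voting; the dataset, component count, and hyperparameters are illustrative, not the paper's tuned configuration.

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# PCA shrinks the feature space so the classifiers stay edge-friendly
hybrid = make_pipeline(
    StandardScaler(),
    PCA(n_components=16),
    VotingClassifier([("dt", DecisionTreeClassifier(max_depth=8, random_state=0)),
                      ("svm", SVC(kernel="rbf", probability=True))],
                     voting="soft"))
hybrid.fit(X_tr, y_tr)
print(f"hybrid accuracy: {hybrid.score(X_te, y_te):.3f}")
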
Energy-Efficient Access Point Deployment for Industrial IoT Systems

Internet of Things (IoT) technologies have impacted many fields by opening up much deeper and more extensive integration of communications connectivity, sensing, and embedded processing. The industrial sector is among the areas that have been impacted greatly — for example, IoT has the potential to provide novel capabilities for more effective tracking, control and optimization of industrial processes. To maintain reliable embedded processing and connectivity in industrial IoT (IIoT) systems, including systems that involve intensive use of smart wearable technologies, energy consumption is often a critical consideration. With this motivation, this paper develops an energy-efficient deployment strategy for access points in IIoT systems. The developed strategy is based on a novel genetic algorithm called the Access Point Placement Genetic Algorithm (AP2GA). Simulation results with our proposed deployment strategy demonstrate the effectiveness of AP2GA in optimizing energy consumption for IIoT systems.

Xiaowen Qi, Jing Geng, Mohamed Kashef, Shuvra S. Bhattacharyya, Richard Candell
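
A self-contained NumPy sketch of a genetic algorithm for AP placement in the spirit of AP2GA, with tournament selection, one-point crossover, Gaussian mutation, and elitism; the squared-distance energy proxy and all parameters are assumptions, not the paper's model.

import numpy as np

rng = np.random.default_rng(5)
devices = rng.uniform(0, 100, (40, 2))       # IIoT node coordinates, metres
n_aps, pop_size, gens = 3, 50, 150

def energy(genome):                          # proxy: transmit energy grows with d^2
    aps = genome.reshape(n_aps, 2)
    d2 = ((devices[:, None, :] - aps[None]) ** 2).sum(-1)
    return d2.min(axis=1).sum()              # each device talks to its nearest AP

pop = rng.uniform(0, 100, (pop_size, n_aps * 2))
best = min(pop, key=energy).copy()
for _ in range(gens):
    fit = np.array([energy(p) for p in pop])
    if fit.min() < energy(best):
        best = pop[fit.argmin()].copy()                 # track best-so-far layout
    i, j = rng.integers(pop_size, size=(2, pop_size))
    parents = pop[np.where(fit[i] < fit[j], i, j)]      # tournament selection
    cut = rng.integers(1, n_aps * 2, (pop_size, 1))
    mask = np.arange(n_aps * 2)[None, :] < cut          # one-point crossover
    pop = np.where(mask, parents, np.roll(parents, 1, axis=0))
    mutate = rng.random(pop.shape) < 0.1                # Gaussian mutation
    pop = np.clip(pop + mutate * rng.normal(0, 2, pop.shape), 0, 100)
    pop[0] = best                                       # elitism
print("AP layout:", best.reshape(n_aps, 2).round(1),
      "energy proxy:", round(energy(best), 1))
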

Hardware/Software Solutions for IoT and CPS (HSS)

Frontmatter
FAMID: False Alarms Mitigation in IoMT Devices

Wearable and Implantable Medical Devices (WIMDs) and Physiological Closed-loop Control Systems (PCLCS) are crucial elements in the advancing field of the Internet of Medical Things (IoMT). Enhancing the safety and reliability of these devices is of utmost importance, as they play a significant role in improving the lives of millions of people every year. Medical devices typically have an alert system that can safeguard patients, facilitate rapid emergency response, and be customized to individual patient needs. However, false alarms are a significant challenge to the alert mechanism, resulting in adverse outcomes such as alarm fatigue, patient distress, treatment disruptions, and increased healthcare costs. Therefore, reducing false alarms in medical devices is crucial to promoting improved patient care. In this study, we investigate the security vulnerabilities posed by WIMDs and PCLCS and the problem of false alarms in closed-loop medical control systems. We propose an implementation-level redundancy technique that can mitigate false alarms in real-time. Our approach, FAMID, utilizes a cloud-based control algorithm implementation capable of accurately detecting and mitigating false alarms. We validate the effectiveness of our proposed approach by conducting experiments on a blood glucose dataset; with the proposed technique, every false alarm was detected and mitigated, so the device did not trigger any false alarms.

Shakil Mahmud, Myles Keller, Samir Ahmed, Robert Karam
Dynamic Task Allocation and Scheduling for Energy Saving in Edge Nodes for IoT Applications

Internet of Things with an Edge layer is a trending approach in areas such as healthcare, home, industry, and transportation. While scheduling the tasks of such applications, if an edge node spends its energy computing latency-insensitive tasks, it might later fail to execute a latency-sensitive task due to low energy. Thus, conserving the energy of the edge node is a key aspect to be considered while designing task allocation and scheduling policies. This can be done by exploiting the inactive state of the edge nodes, which arises when tasks finish in less than their predicted worst-case execution time. Since an inactive node still consumes energy, the best use of this slack is either to execute another node's tasks or to transition to a zero-energy state such as shutdown. Managing the inactive interval in this way also reduces the number of idle intervals in the schedule and the overall idle duration of the edge server, which effectively reduces energy. In a homogeneous multi-edge (HME) system, techniques like Dynamic Procrastination (DP) combined with migration can help an edge node qualify for shutdown, while other nodes can be slowed down using dynamic voltage/frequency scaling (DVFS) to execute tasks with later deadlines and save further energy. Migration combined with DP and DVFS effectively results in improved system utilization and reduced overall energy without affecting performance, but it introduces challenges such as dynamically allocating tasks to edge nodes while meeting deadlines. In this work, we propose a dynamic task allocation and scheduling approach for an HME system that can decide on slowing down or shutting down each edge node. We observe that by decreasing the number of idle intervals and increasing the duration of the inactive state, our approach improves energy consumption over state-of-the-art energy reduction techniques.

Shubhangi K. Gawali, Lucy J. Gudino, Neena Goveas
Deep Learning Based Framework for Forecasting Solar Panel Output Power

Energy harvesting from diverse renewable energy sources has experienced rapid growth due to the adverse environmental impacts of using fossil fuels. Solar energy is a significant energy source frequently used for power generation in various applications. Because solar irradiation, temperature, and other meteorological parameters are variable, Photovoltaic (PV) power generation fluctuates strongly, and this unstable output power has evolved into a considerable issue in various solar energy prediction applications. In this work, a hybrid Deep Learning (DL) model based on Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), and an Attention mechanism is proposed to forecast solar cell output power. The proposed model is implemented, and its performance is compared with other DL models, including CNN, LSTM, and LSTM with an attention mechanism. The model has been trained and evaluated on a publicly available dataset containing 20 parameters on which solar panel output power is relatively dependent, and it yields a maximum coefficient of determination (R2) of up to 84.5%. A lightweight version of the model has also been developed using pruning to deploy the DL model on low-end hardware.

Prajnyajit Mohanty, Umesh Chandra Pati, Kamalakanta Mahapatra
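
A minimal Keras sketch of the hybrid CNN + LSTM + Attention architecture described above, assuming 24 past time steps of the 20 input parameters; the layer sizes are illustrative, and the attention block is Keras's generic self-attention layer, not necessarily the authors' exact mechanism.

import tensorflow as tf

T, F = 24, 20            # 24 past hours of the 20 meteorological parameters
inp = tf.keras.Input(shape=(T, F))
x = tf.keras.layers.Conv1D(32, 3, padding="same", activation="relu")(inp)
h = tf.keras.layers.LSTM(64, return_sequences=True)(x)
a = tf.keras.layers.Attention()([h, h])       # self-attention over time steps
z = tf.keras.layers.GlobalAveragePooling1D()(a)
out = tf.keras.layers.Dense(1)(z)             # next-step output power
model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="mse")
model.summary()
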

AI and Big Data for Next-G Internet of Medical Things (IoMT)

Frontmatter
EHR Security and Privacy Aspects: A Systematic Review

Electronic Health Records (EHRs) have become increasingly popular in recent years, providing a convenient way to store, manage, and share relevant information among healthcare providers. However, as EHRs contain sensitive personal information, ensuring their security and privacy is paramount. This paper reviews the key aspects of EHR security and privacy, including authentication, access control, data encryption, auditing, and risk management. Additionally, the paper discusses the legal and ethical issues surrounding EHRs, such as patient consent, data ownership, and breaches of confidentiality. Effective implementation of security and privacy measures in EHR systems requires a multi-disciplinary approach involving healthcare providers, IT specialists, and regulatory bodies. Ultimately, the goal is to strike a balance between protecting patient privacy and ensuring timely access to critical medical information for future healthcare delivery.

Sourav Banerjee, Sudip Barik, Debashis Das, Uttam Ghosh
SNN Based Neuromorphic Computing Towards Healthcare Applications

The diagnosis, treatment, and prevention of diseases may be revolutionized by integrating neuromorphic computing, artificial intelligence (AI), and machine learning (ML) into medical services. Neuromorphic computing is a novel method of processing complex data that mimics how the human brain works more effectively and quickly. This paper provides an overview of neuromorphic computing and its uses in AI- and ML-based healthcare. We discuss the advantages and disadvantages of using these technologies as well as how they help accelerate the entire diagnostic procedure, and we provide case studies of how neuromorphic applications have been successfully used in the medical field to diagnose and predict diseases. Additionally, we present enhanced Spiking Neural Network application results for the medical and healthcare industries, with up to 98.5% accuracy.

Prasenjit Maji, Ramapati Patra, Kunal Dhibar, Hemanta Kumar Mondal
Crossfire Attack Detection in 6G Networks with the Internet of Things (IoT)

As the internet plays an increasingly vital role in our daily lives, the threat of denial of service (DoS) attacks continues to loom, posing significant challenges to network security. With the proliferation of internet-of-things (IoT) devices, including those in the healthcare sector (IoMT), the need to secure these networks becomes even more critical. The emergence of Mobile Edge Computing (MEC) servers has shifted the focus toward processing data near the network edge to alleviate network congestion. However, a new form of DoS attack, known as the crossfire attack, presents a complex challenge as it is difficult to detect and can have devastating effects on networks. While Software Defined Networks (SDNs) offer promise in mitigating DoS attacks, they also introduce vulnerabilities of their own. This paper explores the current landscape of IoT, IoMT, DoS attacks, and crossfire attacks. It discusses existing defense strategies and proposes a defense mechanism that leverages packet header inspection to differentiate between adversarial and benign packets. The paper concludes with the execution of a crossfire attack in a Mininet environment with the RYU SDN controller, highlighting the need for multiple approaches to protect critical servers in the face of persistent DDoS attacks.

Nicholas Perry, Suman Bhunia

IoT for Wearables and Smart Devices (IWS)

Frontmatter
Prediction of Tomato Leaf Disease Plying Transfer Learning Models

The tomato has a high market value and is one of the vegetables grown in the greatest quantity globally. Tomato plants are susceptible to diseases that can negatively impact the fruit's yield and quality, so detecting these illnesses at an early stage and identifying them accurately is necessary for successfully managing diseases and reducing losses. In recent years, deep learning methods such as convolutional neural networks (CNNs) have demonstrated significant promise in identifying plant diseases from images. This research proposes a CNN-based strategy for detecting tomato leaf diseases using transfer learning, which enhances the performance of a disease detection model trained on a smaller dataset by leveraging pre-trained CNN models that have been trained on large datasets. The proposed transfer learning models, based on ResNet50 and Inception V3, prove effective when applied to a dataset of tomato leaf images: a high level of accuracy is achieved, suitable for practical applications in agriculture.

B. S. Vidhyasagar, Koganti Harshagnan, M. Diviya, Sivakumar Kalimuthu
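
A standard transfer-learning sketch in Keras matching the approach described: a frozen ImageNet-pretrained ResNet50 backbone with a small classification head; the class count and head layers are assumptions, and Inception V3 can be substituted for the backbone.

import tensorflow as tf

base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                      input_shape=(224, 224, 3))
base.trainable = False                      # keep ImageNet features frozen

n_classes = 10                              # e.g., tomato leaf disease classes
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # with a leaf-image dataset
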
Video Captioning Based on Sign Language Using YOLOV8 Model

One of the fastest-growing research areas is sign language recognition, and many novel techniques have recently been created in this field. People who are deaf or unable to speak primarily communicate using sign language, and real-time interpretation of it is essential for them. Hand gestures are one of the non-verbal communication methods used in sign language, and awareness of this language matters because it is these users' primary means of communication. In this work, we propose designing and implementing a model that provides transcripts of the sign language used by disabled individuals during a live meeting or video conference. The dataset utilized in this study is downloaded from the Roboflow website and used for training and testing. Transfer learning is a key idea here, since a pre-trained model is utilized to identify the hand signals. The YOLOv8 model, created by Ultralytics, is employed for this purpose and instantly translates the letters of the alphabet (A-Z) into their corresponding text. In our method, the 26 ASL signs are recognized by first extracting the essential components of each sign from the real-time input video, which are then fed into the YOLOv8 deep learning model to identify the sign. The output is matched against the signs contained in the neural network and classified into the appropriate sign by comparing the extracted features with the original signs present in the database.

B. S. Vidhyasagar, An Sakthi Lakshmanan, M. K. Abishek, Sivakumar Kalimuthu
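
A minimal Ultralytics YOLOv8 usage sketch for per-frame sign recognition; the weights file asl_yolov8.pt and the video source name are hypothetical stand-ins for a model fine-tuned on the Roboflow ASL dataset, not artifacts shipped by the authors.

from ultralytics import YOLO

model = YOLO("asl_yolov8.pt")               # hypothetical fine-tuned A-Z sign detector
for result in model.predict(source="meeting.mp4", stream=True, conf=0.5):
    for box in result.boxes:                # one box per detected hand sign
        letter = result.names[int(box.cls)]
        print(letter, end="")               # append the letter to the transcript
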
Improvement in Multi-resident Activity Recognition System in a Smart Home Using Activity Clustering

Human Activity Recognition (HAR) integrates with ambient assisted living (AAL), leading to smart home automation for monitoring activities, healthcare, fall detection, etc. Various researchers have proposed single-resident HAR systems for ambient-sensor-based smart home data; such systems are simpler, but a single resident is not always the case, and multi-resident recognition is more complex and time-consuming. Researchers have made several efforts to generate benchmark datasets, such as CASAS, ARAS, and vanKasteren, for baseline comparison and performance analysis. However, these datasets have certain limitations, such as data association, annotation scarcity, computational cost, and even data collection itself. This paper analyzes these limitations in depth and manually clusters the activity labels, recording the resulting improvement in the system's recognition rate and computational time on the ARAS dataset.

E. Ramanujam, Sivakumar Kalimuthu, B. V. Harshavardhan, Thinagaran Perumal

Metaverse for IoT (MIoT)

Frontmatter
Forensics Analysis of Virtual Reality Social Community Applications on Oculus Quest 2

As the popularity of virtual reality (VR) applications increases, there is a growing concern for the forensic aspects of the privacy and security of user data. This paper investigates the extent to which forensically relevant user data can be recovered from the Oculus Quest 2, specifically from four social community applications: Horizon Worlds, Multiverse, Rec Room, and VRChat. By performing a forensic analysis on the Oculus Quest 2, we determine to what extent forensically relevant user data can be recovered from these applications. Existing efforts in this field are limited; thus, this study adds to the growing body of research on the forensic analysis of VR applications.

Samuel Ho, Umit Karabiyik
Objective Emotion Quantification in the Metaverse Using Brain Computer Interfaces

Certain human emotions can be quantified by processing electroencephalography (EEG) data. Recent advances in Brain Computer Interfaces (BCIs) allow us to record, process, and determine user functional intent and emotional implication from such data. The Metaverse captures an extensive spectrum of multi-modal content on the Internet, including social media, games, videos, and more complex VR, AR, and MR platforms. We propose an objective method to quantify user emotion using EEG data collected through non-invasive BCIs during user interaction. BCIs qualify as IoT sensors that record EEG data in real-time as users explore multimedia content through several emotion-generating scenarios.

Anca O. Muresan, Meenalosini V. Cruz, Felix G. Hamza-Lup
MetaHap: A Low Cost Haptic Glove for Metaverse

This paper presents the design and development of a low-cost haptic glove equipped with a range of affordable sensors, including gyroscopes, accelerometers, GPS, servos, and encoders, for use in metaverse environments. The primary focus of this research is to create an immersive and interactive Virtual Reality (VR) experience by incorporating haptic feedback into the glove. This research provides a detailed overview of the glove's design and construction, highlighting the integration of Arduino micro-controllers with the various sensors and actuators, and discusses the techniques employed to ensure accurate and synchronized data capture. Furthermore, the haptic feedback system integrated into the glove is thoroughly explained, including the mechanisms for generating the haptic feedback, which allows the user to determine the shape and size of objects in the virtual environment. By utilizing the servos and encoders, the glove can provide users with a tactile experience by simulating the sensation of touching virtual objects or environments. The potential applications of the haptic glove in gaming, virtual training, and medical simulations are explored, emphasizing the benefits of incorporating haptic feedback for enhanced user immersion and engagement. The glove's versatility and affordability make it a viable solution for a wide range of VR applications. In conclusion, this research presents a Single-Board Computer (SBC)-based haptic glove that combines affordable sensors with haptic feedback capabilities, providing users with an immersive and tactile VR experience. The findings contribute to the advancement of VR technology, particularly in the field of haptic interfaces, and open avenues for further exploration and customization of haptic glove applications.

S. Sibi Chakkaravarthy, Marvel M. John, Meenalosini Vimal Cruz, R. Arun Kumar, S. Anitha, S. Karthikeyan

Technologies for Smart Agriculture (TSA)

Frontmatter
CroPAiD: Protection of Information in Agriculture Cyber-Physical Systems Using Distributed Storage and Ledger

The agricultural domain has played a significant role throughout history in human societies across the globe. With the fast growth of communication and information systems, the structure of farming procedures has evolved to new modern standards. Although these advancements have brought many gains, there are many current and rising threats to security in the agricultural domain. The present paper proposes novel methods and architectural designs and implements a distributed ledger through the Tangle platform. The article first discusses the threats and vulnerabilities faced in the farming sector and presents an extensive literature survey, and later conducts an experiment distributing data through a Tangle distributed-ledger system. The authors highlight the limitations of centralized, cloud, and blockchain approaches and suggest mitigation measures through distributed IOTA systems and distributed storage facilities for data, along with the possible influence these solutions can have on data security in the agricultural sector.

Sukrutha L. T. Vangipuram, Saraju P. Mohanty, Elias Kougianos
Sana Solo: An Intelligent Approach to Measure Soil Fertility

Worm castings (worm excretions) are one of the richest natural fertilizers on earth, making earthworms a very important and applicable soil health indicator. According to an article published in the Polish Journal of Environmental Studies, the most important chemical components of worm castings are pH, total organic carbon (TOC), total nitrogen (N), plant-available phosphorus (P), plant-available potassium (K), and water-soluble calcium (Ca). These chemical components of worm castings, paired with soil temperature, humidity, and electrical conductivity, are all measurable values that can indicate the overall health and fertility of soil. Furthermore, these physical-chemical properties can also be measured and analyzed to estimate worm populations in soil, making traditional manual extraction techniques obsolete. The proposed project, Sana Solo, is a device that uses machine learning to estimate worm populations based on the quantities of the physical-chemical properties listed above. Estimating earthworm populations in a timely manner, without extraction techniques, can help farms and gardens evaluate soil fertility.

Laavanya Rachakonda, Samuel Stasiewicz
Smart Agriculture – Demystified

To tackle the adverse effects of climate change, unprecedented population growth, natural calamities, and natural resource depletion, and to ensure food security, smart agriculture is the future of agriculture. This extended abstract for this invited talk focuses on some of the important points of smart agriculture to raise awareness among the future research community.

Alakananda Mitra, Saraju P. Mohanty, Elias Kougianos

Student Research Forum (SRF)

Frontmatter
WeedOut: An Autonomous Weed Sprayer in Smart Agriculture Framework Using Semi-Supervised Non-CNN Annotation

With rising challenges and depleting resources, many automation solutions have been developed in agriculture. Integration of the Internet-of-Agro-Things (IoAT) and Artificial Intelligence (AI) has helped achieve better yields while maximizing the utilization of minimal resources. Weed management, a task affecting crop quality and yield, has attracted automation efforts. However, due to the diverse nature of agriculture, the same crop grown in different geographical locations and growth stages exhibits different features; additionally, unknown weeds may exist in the farm, rendering feature-based supervised CNN solutions unsuitable for weed classification. This paper presents a weed management Agriculture Cyber-Physical System (A-CPS) called WeedOut with a novel methodology enabling it to work in feature-variant environments. WeedOut uses a semi-supervised methodology that classifies plants by their shapes and labels them as primary crop or weed with minimal input from the farmer. An autonomous weed sprayer uses the resulting labeled images to spray herbicide at weed locations while sparing the primary crop.

Kiran Kumar Kethineni, Alakananda Mitra, Saraju P. Mohanty, Elias Kougianos
ALBA: Novel Anomaly Location-Based Authentication in IoMT Environment Using Unsupervised ML

Smartphones have become essential components in the Internet of Medical Things (IoMT), providing convenient interfaces and advanced technology that enable interaction with various medical devices and sensors. This makes smartphones serve as gateways for sensitive data that could potentially affect patients' health and privacy if compromised, making them primary targets for cybersecurity threats. Authentication is crucial for IoMT security, and its effectiveness relies on remaining robust across varying environment, device, and user conditions. In this paper, we propose the Anomaly Location-based Authentication (ALBA) method, which uses GPS technology and a lightweight unsupervised ML algorithm built on more stable features. Our experimental results show that the model successfully identified anomalous locations across three distinct datasets, demonstrating the adaptability of ALBA.

Fawaz J. Alruwaili, Saraju P. Mohanty, Elias Kougianos
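
As a sketch of the location-anomaly idea, the following scikit-learn snippet fits an Isolation Forest (one plausible lightweight unsupervised choice; the paper does not necessarily use this algorithm) to synthetic latitude/longitude login history and flags far-away attempts.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(6)
# stand-in login history: (latitude, longitude) pairs around a home region
home = rng.normal([33.21, -97.13], 0.01, size=(300, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(home)

attempts = np.array([[33.21, -97.13],    # usual location
                     [40.71, -74.00]])   # anomalous location far away
print(model.predict(attempts))           # 1 = authenticate, -1 = challenge/deny
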
A Configurable Activation Function for Variable Bit-Precision DNN Hardware Accelerators

This paper introduces a configurable Activation Function (AF) that utilizes a ROM/Cordic architecture to generate sigmoid and tanh with varying bit precision. Two design strategies are explored: a ROM-based approach for low bit precision and a Cordic-based approach for high bit precision. The accuracy of the configurable AF is assessed on LeNet and VGG-16 DNN models, revealing minimal accuracy loss (less than 1.5%) compared to the TensorFlow-based model. Experimental results on the Xilinx Zybo evaluation kit, using a 'fixed<9, 6>' arithmetic representation, demonstrate the ROM-based approach's memory efficiency, achieving 86.66% LUT savings for 4-bit precision and 80.95% LUT savings for 8-bit precision compared to the Cordic-based approach. The Cordic-based approach, on the other hand, shows ≈93% LUT savings for 16-bit precision compared to the ROM-based approach. The proposed AF exploits the strengths of the ROM and Cordic architectures at the appropriate bit precision to enhance the overall performance of Deep Neural Networks (DNNs).

Sudheer Vishwakarma, Gopal Raut, Narendra Singh Dhakad, Santosh Kumar Vishvakarma, Dhruva Ghai
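
The ROM-based strategy can be illustrated in a few lines: precompute sigmoid over every representable fixed<9,6> input code once (the "ROM"), then evaluate by table lookup. This NumPy sketch models only the numerics, not the RTL; the clipping behavior at the range limits is an assumption.

import numpy as np

INT_BITS, FRAC_BITS = 3, 6                  # 'fixed<9,6>': 9 bits total, 6 fractional
SCALE = 1 << FRAC_BITS

def to_fixed(x):                            # quantize to the fixed-point grid
    return np.round(x * SCALE) / SCALE

# ROM approach: precompute sigmoid for every representable 9-bit signed code
addr = np.arange(-(1 << 8), 1 << 8)
rom = to_fixed(1.0 / (1.0 + np.exp(-addr / SCALE)))

def sigmoid_rom(x):                         # "hardware" lookup: index by the code
    code = np.clip(np.round(x * SCALE), -(1 << 8), (1 << 8) - 1).astype(int)
    return rom[code + (1 << 8)]

xs = np.linspace(-4, 4, 9)
print(np.abs(sigmoid_rom(xs) - 1 / (1 + np.exp(-xs))).max())  # quantization error
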
Authentication and Authorization of IoT Edge Devices Using Artificial Intelligence

The field of Internet of Things (IoT) has experienced rapid growth, but it has also introduced significant security and privacy challenges. In particular, the authentication and authorization of edge devices pose major concerns due to their limited resources. While various solutions have been proposed, most of them rely on increasing the computing power, storage, and power capabilities of edge devices. However, these solutions are not practical because of the constraints imposed by the small size and cost-effectiveness requirements of IoT edge devices. Some suggestions involve the use of lightweight cryptographic primitives, but not all edge devices have the necessary resources to implement such solutions. This paper presents a novel approach to addressing the authentication and authorization challenges in edge devices by leveraging artificial intelligence (AI). The proposed solution adopts a fog computing model within the framework of a smart home, but it does not depend on the computational or storage capabilities of the edge devices.

Muhammad Sharjeel Zareen, Shahzaib Tahir, Baber Aslam
Secure Dynamic PUF for IoT Security

This student research forum paper is based on our accepted work [1]. The widespread adoption of the Internet of Things (IoT) has brought many benefits to our lives. Still, the low-power, heterogeneous, and resource-constrained nature of IoT devices makes it difficult to ensure secure communication and authenticity. Physical Unclonable Functions (PUFs) provide a promising solution by generating a unique and device-specific identity through manufacturing process variations without requiring additional resources. However, recent advances in machine learning algorithms like artificial neural networks and logistic regression have made it possible to predict PUF responses by training the model. Machine learning models can use multiple challenges and responses to predict accurate results from the PUF. To address this concern, we propose integrating a dynamically configurable PUF structure into the design to counteract machine learning attacks. The dynamicity of the PUF makes it challenging for machine learning models to predict PUF responses.

Shailesh Rajput, Jaya Dofe
Backmatter
Metadata
Title
Internet of Things. Advances in Information and Communication Technology
edited by
Deepak Puthal
Saraju Mohanty
Baek-Young Choi
Copyright Year
2024
Electronic ISBN
978-3-031-45878-1
Print ISBN
978-3-031-45877-4
DOI
https://doi.org/10.1007/978-3-031-45878-1