
2025 | Book

Applied Innovations in Information and Communication Technology

Advanced Techniques and Modern Trends in Information and Communication Technology

Edited by: Stanislav Dovgyi, Eduard Siemens, Larysa Globa, Oleh Kopiika, Oleksandr Stryzhak

Publisher: Springer Nature Switzerland

Book series: Lecture Notes in Networks and Systems


About this book

This book highlights the key research areas of information and communication technology and their impact on the digital society and on sustainable environmental development. It covers research on information and communication technologies, artificial intelligence in ICT, data analysis, the security of data and services, the reduction of energy consumption in the digital environment, and mathematical modelling for practical and research tasks in communication and data processing, contributed by research groups from Germany and Ukraine in cooperation with scientists from various countries. The studies presented discuss the use of artificial intelligence, in particular deep-learning methods, the practical implementation of the Internet of Things (IoT), current work on eco-monitoring systems, and research on the mathematical modelling of applied problems. The book focuses on the foundations of information and analytical activities in the global digital space, on providing broadband Internet access without degrading the quality of experience (QoE), on improving service delivery, and on system architectures for SDN. The survey of modern communication and information technologies includes original papers addressing many aspects of their improvement and their use for forecasting sustainable social and environmental development on the basis of the global information space, as well as papers presenting effective technological solutions applicable to the implementation of novel cloud infrastructures and radio-electronic systems.
These results can be used to implement novel systems and to foster information exchange in electronic societies. Given its scope, the book is a valuable resource for scientists, lecturers, industry specialists, doctoral students, and students working on problems of information and communication technology and on aspects of the sustainable development of society and the environment.

Table of Contents

Frontmatter

Communication and Data Transport Technologies

Frontmatter
The Method of Multi-vector Routing in Mobile IoT Networks Under the Influence of Strong Interference

The article proposes a method of multi-vector routing for mobile data transmission systems operating under powerful interference, using turbo codes. The adaptive routing algorithm is complemented by turbo coding's ability to correct errors during data transmission, which improves network reliability even at a high bit error rate (BER). When errors are detected in the received signal, turbo decoding corrects them with high probability. The advantages of this algorithm are a lower residual transmission error even at high BER and improved reliability and stability of the IoT network. Mobile IoT networks require efficient routing methods to ensure reliable communication between connected devices, and combining turbo coding with multi-vector routing can improve transmission quality and make the network more fault-tolerant. Research in this direction is therefore an urgent task, and further development of these methods can contribute to the improvement of mobile IoT networks. Analysing multi-vector routing in mobile IoT networks is challenging because of the large number of factors that affect transmission efficiency; the combination of turbo coding and proper routing can improve network quality and reliability.
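The route-selection idea above can be sketched as follows: prefer the candidate route with the lowest estimated BER, and reject routes whose BER is beyond what error correction can plausibly repair. This is an illustrative toy, not the authors' algorithm; the route names and the `max_ber` threshold are hypothetical, and the turbo encoding/decoding step itself is omitted.

```python
# Illustrative sketch only: selecting a route in a multi-path IoT network
# by estimated bit error rate (BER). Route ids and the threshold are
# hypothetical; the paper's method additionally applies turbo decoding.

def select_route(routes, max_ber=0.1):
    """Pick the candidate route with the lowest estimated BER.

    routes: dict mapping route id -> estimated BER in [0, 1].
    Returns the best route id, or None if every route exceeds max_ber,
    i.e. is assumed uncorrectable even with error-correcting codes.
    """
    usable = {r: ber for r, ber in routes.items() if ber <= max_ber}
    if not usable:
        return None
    return min(usable, key=usable.get)

candidates = {"A-B-D": 0.02, "A-C-D": 0.005, "A-E-D": 0.3}
best = select_route(candidates)
```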

Serhii Zaitsev, Andrii Rohovenko, Yevhen Ryndych, Yuliia Tkach, Borys Horlynskyi, Oleksandr Rudenok
Simulation Modeling Algorithm of Low-Bandwidth Radio Network of Automated Control Systems

Governmental automated control systems at the lower echelons of management, with constantly moving users, are usually built and operated on Ultra High Frequency and Very High Frequency low-bandwidth radio networks. Communication networks of such automated control systems are characterized by low bandwidth, long data transmission delays, and jitter. Simulation modeling makes it possible to choose the necessary technical characteristics of telecommunication equipment to ensure service availability as early as the design stage. The paper analyzes a generalized model of service quality indicators for radio communication networks of automated control systems based on low-bandwidth radio networks. It determines the average bandwidth, jitter, channel losses, and message response time, as well as the distribution laws of the intervals between service requests, the average data packet size, and the required bandwidth for different traffic types in an automated control system based on a low-bandwidth radio network. For simulation modeling, the paper proposes a radio network maintenance model that determines the probability that the communication network can serve a request. Software libraries are proposed for implementing a pseudo-random number generator that supports the distribution laws of the intervals between service requests for the traffic of governmental automated control systems. Based on the radio network service model and the obtained traffic characteristics, an algorithm is proposed for simulating the communication network of automated control systems, determining the load on the communication system and the probability of service.
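The pseudo-random generation of inter-arrival intervals mentioned above can be sketched for the simplest case, an exponential (Poisson-arrival) distribution law. All numeric values below (rate, packet size, channel bandwidth) are hypothetical placeholders, not figures from the paper.

```python
# Hedged sketch: generating pseudo-random inter-arrival intervals for a
# low-bandwidth radio network simulation, assuming exponentially
# distributed intervals (one possible distribution law).
import random

def exponential_intervals(rate, n, seed=42):
    """Return n inter-arrival intervals (seconds) for a Poisson arrival
    process with the given mean request rate (requests per second)."""
    rng = random.Random(seed)           # seeded for reproducible runs
    return [rng.expovariate(rate) for _ in range(n)]

def offered_load(rate, mean_packet_bits, bandwidth_bps):
    """Fraction of channel capacity consumed by one traffic source."""
    return rate * mean_packet_bits / bandwidth_bps

intervals = exponential_intervals(rate=0.5, n=1000)
load = offered_load(rate=0.5, mean_packet_bits=2000, bandwidth_bps=16000)
```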

Irina Strelkovskaya, Roman Zolotukhin
Use of Heterogeneous Special Purpose Telecommunication Networks for Provision of Convergent Services

The article considers the possibility of providing converged services over traditional heterogeneous telecommunication networks and over networks used on the battlefield. The management of heterogeneous telecommunication networks must ensure the effective configuration of network elements in order to provide converged services; this makes it closer to cloud computing than to traditional management and transforms the networks into software-defined networks (SDN). Platforms of network resources and services are central to building SDN for traditional networks, and the article presents the architecture of a network services platform that makes it possible to provide converged services over heterogeneous networks. However, such a platform cannot be implemented when heterogeneous networks are used in combat conditions. Instead of a network services platform, it is proposed to use a unified mechanism for exchanging information across heterogeneous networks to provide converged services. The unified mechanism uses a publish-subscribe architecture, is not tied to the message format (data-agnostic), and is lightweight with respect to resources; most importantly, client connections can, if necessary, be switched from an internal computer to an external network without structural changes, which makes the software product invariant to the subnets used for information exchange. To predict the operation of heterogeneous networks, a method for assessing the quality characteristics of a telecommunication network depending on its structure is considered. Examples of the use of the unified information exchange mechanism on heterogeneous networks for the provision of convergent services are given.
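The data-agnostic publish-subscribe idea can be illustrated with a minimal in-process broker. This sketch shows only the architectural pattern, not the paper's actual exchange mechanism; the topic names are invented.

```python
# Minimal illustrative publish-subscribe broker. It is data-agnostic:
# payloads are passed through as opaque objects and never inspected.
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # The broker never parses the payload (message-format agnostic).
        for cb in self._subs[topic]:
            cb(payload)

broker = Broker()
received = []
broker.subscribe("sensors/temp", received.append)
broker.publish("sensors/temp", b'{"c": 21.5}')
broker.publish("sensors/other", b"ignored")   # no subscriber, dropped
```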

Oleh Kopiika, Volodymyr Kalinin, Alina Lytvynenko
Methodology for Choosing the Best Network Topology for the Multiple Objects Network

Extending broadband Internet access to unserved and underserved populations is one of the core pillars of telecommunication development. Tools must account for the differing physical and economic environments in which service providers operate, so that ICT network infrastructures can be deployed and successfully run. Putting in place the right and appropriate tools to foster infrastructure deployment is vital to promoting digital inclusion through universal access to fast, reliable online technologies and services. This article, part of the Broadband Connectivity Toolkit Methodologies Series, describes a methodology for choosing the best network topology for a multiple-objects network (a network of localities, schools, hospitals, etc.). The described algorithm selects, among the possible paths between two or more objects (localities), the one with the larger net present value (NPV, taking the cost of ownership into account). The initial data for the algorithm are the geographical coordinates (longitude and latitude) of the investigated objects; the geographical coordinates of the nearest backbone point (if available); the required capacity of the investigated objects; and the costs of deploying and maintaining infrastructure facilities.
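The first steps such a methodology implies can be sketched: compute great-circle distances between the objects from their coordinates, then form a candidate topology (here a minimum spanning tree) whose total length feeds a cost or NPV comparison. The coordinates below are invented, and the NPV weighting itself is not reproduced.

```python
# Sketch: distances from (lat, lon) coordinates, then the total length
# of a minimum-spanning-tree topology as one candidate for comparison.
from math import radians, sin, cos, asin, sqrt

def haversine_km(p, q):
    """Great-circle distance in km between (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def mst_length_km(points):
    """Total link length of a minimum spanning tree (Prim's algorithm)."""
    in_tree, total = {0}, 0.0
    while len(in_tree) < len(points):
        d, j = min(
            (haversine_km(points[i], points[k]), k)
            for i in in_tree for k in range(len(points)) if k not in in_tree
        )
        in_tree.add(j)
        total += d
    return total

villages = [(50.45, 30.52), (50.40, 30.60), (50.50, 30.70)]  # hypothetical
length = mst_length_km(villages)
```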

Vadym Kaptur, Volodymir Baliar, Olena Mazurkiewicz
Turing Machine Development for High Protected Remote Control of the IoT Mobile Platform

The paper considers privacy issues of IoT networks and systems, with a particular focus on the remote control of mobile robotics. Despite significant progress in IT cybersecurity, new tasks and threats emerge, such as remotely controlled, mass-produced disposable unmanned mobile devices, for which known approaches are not efficient enough. In this respect, a novel Turing machine-based method has been developed in this work for highly protected remote control of an IoT mobile platform. An original flow chart of IoT mobile platform control is introduced, and a special system-interoperability model is constructed for remote communication with the IoT platform. A Turing machine formalism is defined for flexible frame encoding, along with a sender-receiver interaction protocol. A Turing machine tag table is specified over a set of ternary alphabet symbols, as well as a formal grammar syntax for volatile frame encoding with two typical patterns. A Turing machine algorithm is constructed for highly protected control of a robotic mobile platform. As a result, special mobile objects can apply unique low-cost encryption schemes with unpredictable frame encoding, making the communication channel harder to hack. The results of the work are intended to improve the protection of mobile objects against cyber threats, and to benefit remote vehicle control and other real-time IoT applications.
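To make the formalism concrete, here is a tiny single-tape Turing machine interpreter over a ternary alphabet {'0','1','2'}. The transition table is a toy (incrementing a ternary number), not the paper's tag table, which is not reproduced here.

```python
# Illustrative Turing machine interpreter; the rule table below is a toy
# example (ternary increment), not the paper's actual tags-table.

def run_tm(tape, rules, state="inc", blank="_", max_steps=10000):
    """rules: {(state, symbol): (new_symbol, move, new_state)};
    move is -1 (left), 0 (stay) or +1 (right); state 'halt' stops."""
    tape = list(tape)
    pos = len(tape) - 1                  # start at least-significant digit
    for _ in range(max_steps):
        if state == "halt":
            return "".join(tape).strip(blank)
        sym = tape[pos] if 0 <= pos < len(tape) else blank
        if pos < 0:                      # grow tape to the left on demand
            tape.insert(0, blank)
            pos = 0
        new_sym, move, state = rules[(state, sym)]
        tape[pos] = new_sym
        pos += move
    raise RuntimeError("machine did not halt")

# Ternary increment: carry while reading '2', else bump the digit and halt.
RULES = {
    ("inc", "0"): ("1", 0, "halt"),
    ("inc", "1"): ("2", 0, "halt"),
    ("inc", "2"): ("0", -1, "inc"),
    ("inc", "_"): ("1", 0, "halt"),
}
```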

Victor Tikhonov, Abdullah Taher, Kateryna Shulakova, Serhii Tikhonov, Olexander Demchenko
Distributed Multi-agent Systems Based on the Mixture of Experts Architecture in the Context of 6G Wireless Technologies

The chapter explores the issue of creating distributed multi-agent systems based on the Mixture of Experts (MoE) architecture in the context of sixth-generation (6G) wireless technologies. The problem lies in ensuring effective coordination and interaction of millions of experts deployed across various hardware platforms using high-speed wireless networks. The general steps to solve this problem include scaling MoE systems to a system of systems, as well as the introduction of intelligent resource management mechanisms and ensuring high performance through the use of digital antenna arrays and digital beamforming to mitigate interference. The article proposes a solution based on combining distributed systems of experts interacting via high-speed wireless technologies to optimize performance and reliability. The main idea of the approach is to use the MoE architecture, which allows the distribution of computational resources depending on the tasks, ensuring high flexibility and scalability of the system. The approach has been checked by analyzing the integration capabilities of artificial intelligence technologies for adaptive network management and using high-speed wireless technologies. Additionally, a modification of Orthogonal Time Frequency Space (OTFS) modulation based on Non-Orthogonal Frequency Division Multiplexing (N-OFDM) has been proposed to increase spectral efficiency and reduce packet duration by using overlapping impulses. The research results demonstrate the possibility of achieving high performance, reliability, and resource optimization, making the proposed approach promising for use in various industries.
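The Mixture-of-Experts building block the chapter scales up can be sketched in its simplest form: a gating function scores the experts, the top-k are selected, and their outputs are combined by renormalised gate weights. The distribution of experts across wireless nodes is abstracted away; experts here are plain functions and all values are illustrative.

```python
# Hedged MoE sketch: softmax gating, top-k expert selection, weighted
# combination. Experts are local functions, not distributed 6G nodes.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe(x, experts, gate_scores, top_k=2):
    """Combine the outputs of the top_k experts, weighted by the
    renormalised gate probabilities."""
    probs = softmax(gate_scores)
    top = sorted(range(len(experts)), key=lambda i: -probs[i])[:top_k]
    norm = sum(probs[i] for i in top)
    return sum(probs[i] / norm * experts[i](x) for i in top)

experts = [lambda x: 2 * x, lambda x: x + 1, lambda x: -x]
y = moe(3.0, experts, gate_scores=[2.0, 1.0, -1.0], top_k=2)
```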

Vadym Slyusar
Technological Principles for Building a Network Architecture of Service Data Processing Centers

The paper addresses a particular type of IT infrastructure development. It discusses the architecture of a corporation's network data processing centre, which is designed to provide reliable, scalable, and available communication with the network at both the physical and logical levels, following the requirements of the enterprise. To provide proper network services to application programs, the network architecture should be designed together with the security system architecture, which sets specific requirements at the structural (devices) and logical (configurations) levels. The following requirements were met when designing the network architecture: availability, security, scalability, manageability, supportability, consolidation, and interoperability. The paper discusses the example of the data transmission network of Ukrtelecom, the national electronic communications operator. The corporation's network is logically divided into security zones using perimeter protection services; seven main security zones have been established. A method of extreme partial flows is presented to optimise the technical parameters of the network. The iterative algorithm of this method determines the throughput of the data transmission network for a given load across the directions and finds the optimum load distribution with respect to the maximum flow of communication lines and network nodes for each direction.
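The maximum-flow computation that load-distribution methods of this kind build on can be sketched generically. The following is a standard Edmonds-Karp implementation, not the paper's iterative method of extreme partial flows; the capacities are invented.

```python
# Generic Edmonds-Karp max-flow sketch on a capacitated network.
from collections import deque

def max_flow(cap, s, t):
    """cap: dict-of-dicts {u: {v: capacity}}. Returns max s->t flow."""
    res = {u: dict(nbrs) for u, nbrs in cap.items()}   # residual caps
    for u, nbrs in cap.items():
        for v in nbrs:
            res.setdefault(v, {}).setdefault(u, 0)     # reverse edges
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        # bottleneck along the path, then augment
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= push
            res[v][u] += push
        flow += push

net = {"s": {"a": 3, "b": 2}, "a": {"t": 2}, "b": {"t": 3}}
```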

Oleh Kopiika, Stanislav Dovgyi, Andrii Yaremenko
Comparison of Performance and Power Consumption in Sigfox, NB-IoT, and LTE-M

This study provides an in-depth analysis of the power consumption characteristics of Sigfox, Narrowband-IoT (NB-IoT), and Long-Term Evolution for Machines (LTE-M) technologies. The aim is to identify limitations and opportunities for improving energy efficiency across these Low Power Wide Area Network (LPWAN) technologies while maintaining maximum operating range. The experimental setup involves testing the transceiver modules for each of the three technologies in a controlled environment. Parameters such as transmission power, signal-to-noise ratio (SNR), received signal strength indicator (RSSI), round-trip time (RTT), uplink and downlink delay, and sleep periods have been evaluated. Various network configurations and scenarios have been examined to assess their impact on transceiver module performance. Special emphasis was placed on the power consumption of each transceiver module, with attention to how different transmission powers, message sizes, bit rates, and modulation schemes affect the energy consumption. The study also explores strategies for optimizing device energy consumption and extending battery life across the tested technologies.
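The battery-life reasoning behind such studies can be sketched as a back-of-the-envelope duty-cycle calculation: average daily charge is the time-weighted sum of transmit, receive, and sleep currents. Every number below is a hypothetical placeholder, not a measurement from the study.

```python
# Duty-cycle battery-life sketch for an LPWAN end device.

def battery_life_days(capacity_mah, phases):
    """phases: list of (seconds_per_day, current_ma) tuples.
    Returns estimated battery life in days."""
    mah_per_day = sum(sec * ma for sec, ma in phases) / 3600.0
    return capacity_mah / mah_per_day

# Hypothetical profile: 2 s TX/day at 120 mA, 4 s RX at 40 mA,
# the rest of the day asleep at 5 uA.
DAY = 24 * 3600
phases = [(2, 120.0), (4, 40.0), (DAY - 6, 0.005)]
life = battery_life_days(2400.0, phases)
```

Under these (optimistic, ideal-battery) assumptions the sleep current dominates, which is why the paper's attention to sleep periods matters.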

Simeon Trendov, Eduard Sariiev, Khalid Bsheer Suliman Mukhtar, Dmitry Kachan, Eduard Siemens

Trends in the Development of Information and Communication Technologies and Systems

Frontmatter
Approach to Effective Query Execution

Databases vary greatly in size, from small datasets to collections containing terabytes of information. As databases grow, the time required to retrieve data increases, slowing server response times. While advancements in hardware can help, they often fall short of meeting the demands of large-scale data processing. Optimizing SQL queries offers a more efficient solution for improving database performance. With data volume expanding exponentially, the urgency for effective query optimization has intensified. This paper introduces a novel method for optimizing SQL query execution, focusing on performance and usability. The approach synthesizes complex SQL queries from multiple simpler ones to improve execution speed, especially when traditional methods are insufficient. A core strategy involves replacing the commonly used IN operator with a temporary table and a non-clustered index. This method significantly reduces data retrieval times by lowering the number of logical references, resulting in faster data selection, more efficient resource use, and reduced server load. The study achieved its goals through three main tasks: (1) Analyzing existing SQL optimization techniques and their limitations, (2) developing a new optimization method to enhance performance and scalability, and (3) testing the effectiveness of this method through experimentation. This paper contributes a practical solution to the growing demands of modern database systems, providing a scalable, efficient approach to handle increasing data volumes and operational requirements.
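The core strategy described above, materialising an `IN` list into an indexed temporary table and joining against it, can be illustrated with a self-contained example. The paper evidently targets a server DBMS (non-clustered indexes); sqlite3 is used here only to keep the sketch runnable, and the table and column names are invented.

```python
# Sketch of the idea only: replace a long IN (...) list with a temporary
# table plus an index, then join against it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

wanted = [3, 17, 42]

# Instead of: SELECT * FROM orders WHERE customer_id IN (3, 17, 42)
conn.execute("CREATE TEMP TABLE wanted_ids (customer_id INT)")
conn.executemany("INSERT INTO wanted_ids VALUES (?)",
                 [(w,) for w in wanted])
conn.execute("CREATE INDEX idx_wanted ON wanted_ids (customer_id)")

rows = conn.execute(
    "SELECT o.* FROM orders o JOIN wanted_ids w USING (customer_id)"
).fetchall()
```

On a server DBMS the benefit comes from the optimiser using the index on the temporary table instead of scanning a long literal list; whether it pays off depends on list size and statistics.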

Svitlana Sulima, Larysa Globa, Olexander Iermolaiev
Possible Features of Designing Telemedicine Networks and Telemedicine Stations

Despite the rapid growth of the eHealth sphere, and of telemedicine services in particular, the scientific problem of creating and developing telemedicine networks (TMN) and telemedicine stations (TS) remains relevant, especially at the design stage and during the technical implementation of TMNs and their nodes. The result is complete or partial incompatibility of existing designs, construction documentation, and blueprints, which leads to difficulties in combining various fragments of TMNs and impedes international cooperation in this field. The solution to this problem starts by defining a TS as a composition of its subsystems and then defining an extended TS classification model. The proposed classification model is based on the faceted classification method: it allows classes to be constructed from any combination of the characteristic features of TSs, while some features may be omitted entirely. A mathematical description of the classification model is provided as a faceted formula. To begin designing a TS, it is necessary to determine its class according to the extended classification model and to synthesize its morphological model, on the basis of which technical specifications for the design can be formulated. For this purpose, an approach is proposed that is based on system analysis and business analytics methods and involves interviewing a subject-matter expert (a medic). A special "necessity-sufficiency" framework has been developed, which allows one to formulate, systematize, and rank the questions put to an expert. The questions are not limited to the designed TS but also cover related factors and design conditions. The approach was validated for real geographical objects.

Roman Tsarov, Vladyslav Kumysh, Iryna Tymchenko
Applying Adaptive Learning Techniques for Studying of Installation Process an Operating System on a Personal Computer

General computerization has already reached a level at which any professional PC user, not only a programmer, should be able to install an operating system (OS) on his or her computer that meets his or her needs. The widespread Windows OS, although it provides the largest selection of software, is quite expensive and overloaded with packages of mandatory utility programs that many users do not need. Linux-based OSs do not have these disadvantages, since they are completely free and allow installing only the programs the user needs. However, the wide use of OSs based on the Linux kernel is limited by user unfamiliarity with this area and by concerns about the complexity of the installation process. At the same time, much recent attention in pedagogy has been paid to methods for accelerating the learning process, owing to the constant increase in the amount of information a student must learn. The most popular technique for increasing the intensity of learning is the adaptation of complex educational material for students with different levels of preparation. This labor-intensive task requires the teacher to apply several known adaptive teaching methods, depending on the teaching conditions and the complexity of the material. This paper discusses the use of adaptive learning techniques to increase the accessibility and improve the comprehension of such complex educational material as the sequence of operations and commands for installing an OS based on the Linux kernel.

Irina Strelkovskaya, Tetiana Hryhorieva, Viktor Gorbachev, Rostyslav Bykov
Determination the Conversion Power of the Most Popular Switching Converters for Computer Systems

In the field of switching conversion, it is commonly assumed that the power of a converter alone determines its size, weight, and cost. However, the power of the converter does not fully determine its parameters. The article shows that in a switching converter it is possible to identify a virtual or real conversion unit that transmits energy through the magnetic field of the inductor, and that the power of this conversion unit can be less than or equal to the power of the converter. Therefore, a difference exists between the power of the converter and the conversion power. The article defines the conversion power for the most popular switching converters used in computer systems. Using known formulas, the conversion power calculation was verified for Buck, Boost, and Buck-Boost converters whose inductors operate in the same modes, confirming the correctness of the theoretical assumptions. Introducing the terms "conversion power" and "conversion unit" into the theory of switching conversion allows a deeper understanding of switching energy conversion processes and the creation of new converter circuits for computer systems with better technical characteristics.
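For context, the "known formulas" such a verification presumably relies on are the standard ideal continuous-conduction-mode voltage conversion ratios (textbook relations stated here for reference; the paper's own conversion-power expressions, its actual contribution, are not reproduced):

```latex
\frac{V_{\text{out}}}{V_{\text{in}}} = D \ \text{(Buck)}, \qquad
\frac{V_{\text{out}}}{V_{\text{in}}} = \frac{1}{1-D} \ \text{(Boost)}, \qquad
\frac{V_{\text{out}}}{V_{\text{in}}} = -\frac{D}{1-D} \ \text{(Buck-Boost)},
```

where D is the switching duty cycle, 0 < D < 1.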

Oleksandr Rusu, Denys Rozenvasser, Anna Vakarchuk
Achieving Cyber-Physical Consistency for Immersive Robot Controlling

In cyber-physical systems, the digital shadow of the physical system may gradually degenerate and deviate from reality, causing cyber-physical inconsistency. This is a major problem, since consistency must be constantly preserved for accurate remote control of the physical system, raising the need for immediate identification and correction. In this paper, we investigate cyber-physical inconsistency for immersive telerobotics. We give a state-of-the-art survey, examining different forms of cyber-physical inconsistency and showing how they can be handled. We propose a software system for immersive telerobotics that unifies, within a single framework, the approaches for handling different forms of cyber-physical inconsistency: real-time virtual reality immersion into the robot environment, feedback for the operator's hands through exoskeleton gloves, timestamp labeling for motion tracking, and synchronization of the operator-arm-to-robot-arm coupling through vibration feedback. We have successfully applied our software system to an immersive telerobotics scenario, showing that it is a step towards a holistic system enabling both availability and consistency in telerobotics.

Mikhail Belov, Dmytro Pukhkaiev, Victor Victor, Uwe Aßmann
Quantifying the Economic Impact of Investment Activities: Methods and Applications

The article delves into the multifaceted realm of the Economic Value of Investment Activities (EVIA) within Ukrainian mobile providers, traversing the intricate landscape of value creation and financial dynamics. At its core lies a nuanced exploration of the challenges that reverberate throughout the industry, with a particular focus on the evolving paradigms of customer value perception and the intricate web of financing strategies. These challenges, deeply entrenched within the structural fabric of the Ukrainian telecommunications market, transcend mere managerial oversights, reflecting broader systemic intricacies that necessitate a paradigm shift in investment strategy assessment. In response to these exigencies, the authors advocate for a transformative methodological framework that transcends conventional metrics, one that integrates the diverse perspectives of stakeholders and harnesses the analytical prowess of the Distance from a Benchmark Method (DBM). This innovative approach not only offers a holistic understanding of EVIA but also facilitates a comprehensive comparative analysis across multiple companies, illuminating the nuanced interplay of various factors shaping investment efficiency within the multifaceted telecommunications sector. By applying the DBM approach to EVIA assessment among Ukrainian mobile operators, the article unveils profound insights into the fundamental challenges plaguing optimal capital utilization, the dynamic nature of customer preferences, and the strategic imperatives of financing differentiation. In doing so, it provides a roadmap for strategic intervention and optimization, equipping industry stakeholders with the analytical tools necessary to navigate the complex terrain of investment activities in the ever-evolving Ukrainian telecommunications market.
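A distance-from-a-benchmark style comparison can be sketched generically: normalise each indicator across the compared companies, define the benchmark as the best value per indicator, and score each company by its Euclidean distance from that benchmark (smaller is better). This is a generic reconstruction of the idea, not necessarily the authors' exact DBM; company names and indicator values are invented.

```python
# Generic distance-from-a-benchmark sketch (min-max normalisation,
# Euclidean distance from the per-indicator best value).
from math import sqrt

def dbm_scores(companies):
    """companies: {name: [indicator values]}, higher assumed better.
    Returns {name: distance from benchmark}; smaller is better."""
    n = len(next(iter(companies.values())))
    lo = [min(v[i] for v in companies.values()) for i in range(n)]
    hi = [max(v[i] for v in companies.values()) for i in range(n)]
    def norm(v):
        return [(v[i] - lo[i]) / (hi[i] - lo[i]) if hi[i] > lo[i] else 1.0
                for i in range(n)]
    return {name: sqrt(sum((1.0 - x) ** 2 for x in norm(v)))
            for name, v in companies.items()}

scores = dbm_scores({"A": [10, 0.8], "B": [7, 0.9], "C": [4, 0.5]})
```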

Iryna Stankevych, Inna Yatskevych, Hanna Sakun, Oksana Vasylenko, Eduard Siemens
Analysis of Modern Solutions for the Identification of Anonymous Users

In the digital world, new software, websites, and services are constantly emerging, which in turn requires cybersecurity specialists and tools to protect the resources of information systems from threats. Security violators make every effort to ensure their anonymity, making their real-time detection crucial. By analyzing data from leading research, specific services and solutions aimed at identifying anonymous users have been highlighted. However, these publications have not yet developed a comprehensive set of criteria that characterize the relevant tools, which would allow for a systematic approach to selecting solutions for addressing cybersecurity tasks. Therefore, analyzing modern cybersecurity solutions for identifying anonymous users and the corresponding existing services for ensuring the cybersecurity of critical infrastructure is an urgent scientific task. An analysis of tools and solutions for identifying anonymous users was conducted based on criteria such as accuracy, speed, content delivery network, web application firewall, DDoS attack detection, bot detection, fraud and device identification, ease of integration, behavioral analysis, use of artificial intelligence technologies, local or cloud-based deployment, open-source availability, confidentiality assurance, and the cost of solutions for ensuring the cybersecurity of websites and critical infrastructure. A comparative characterization was also conducted and services with the best user identification accuracy for countering cyber threats were identified and their features, differences, strengths, and weaknesses were highlighted. The results obtained enable a systematic approach, based on a range of criteria, to timely detect incidents and minimize the risks of new cyberattacks, thereby ensuring the cybersecurity of critical infrastructure.

Oleksandr Korchenko, Anna Korchenko, Ivan Azarov
Ontology-Driven Approach to the Structuring of Information Resources Describing a Subject Domain

The information space of the modern world is expanding at an unprecedented rate. The amount of information that experts have to work with is becoming so vast that traditional methods and tools are insufficient. This creates a growing need to automate information processing in order to reduce errors and boost efficiency, which in turn requires the development and application of new specialized software tools. However, creating such tools requires resources, specialized knowledge, and computational infrastructure that many experts do not possess. This can be addressed by adopting efficient automation methodologies that are easy to learn and to use. Such a methodology is proposed here, based on the use of ontologies. Ontologies allow a formalized description of a subject area by structuring knowledge in the form of concepts and the relationships between them. The proposed approach involves the use of ontology-driven systems, which allow effective processing of large, weakly structured and unstructured datasets. The flexibility and adaptability of ontology-driven systems are among their key advantages, allowing the rapid deployment of information-analytical systems and their easy adaptation to new conditions and needs. An example of the successful application of this approach is provided: the creation of a system for monitoring and analyzing damaged property, stolen equipment, and the loss of human resources in Ukraine's scientific and educational sector. This monitoring aids Ukraine in assessing the scale of losses due to criminal military aggression, enabling prompt responses to ongoing challenges.

Vitalii Prykhodniuk, Viacheslav Gorborukov
A Comprehensive Integration of Practical Strategies in DevOps

This scientific article presents a comprehensive investigation into the amalgamated technical expertise cultivated through hands-on engagement in commercial projects. Its overarching aim is to align the training of emerging DevOps engineers with the dynamic requirements of the real-world industry. Central to our inquiry is the meticulous examination of the intricate stages within CI processes, informed by the rich practical insights of seasoned DevOps professionals. Our exploration reveals invaluable insights, extrapolating their relevance to benefit both IT educators and active DevOps practitioners alike. Rooted in the pragmatic utilization of a diverse spectrum of DevOps practices, we elucidate these stages with real-world examples and experiences. By drawing upon multifaceted encounters encountered in real-world scenarios, our article offers a nuanced comprehension of the challenges and successes inherent in the DevOps landscape. This nuanced approach not only enriches the theoretical knowledge transmitted to future engineers but also furnishes a repository of practical wisdom readily applicable in professional environments. The article also examines the influence of integrating AI into DevOps workflows as a transformative factor. Building on the foundational practices of DevOps, AI’s role in automating routine tasks, optimizing CI/CD pipelines, and enhancing decision-making processes exemplifies how cutting-edge technology bridges the gap between theory and practice.

Liliia Bodnar, Mykola Bodnar, Kateryna Shulakova, Oksana Vasylenko, Eduard Siemens, Oleksandra Tsyra
Methods of Spline Functions in Information Technologies

The paper considers solving problems in information technologies by means of spline approximation and spline extrapolation based on real and complex spline functions. The use of spline approximation based on real quadratic splines is shown for the problem of self-similar traffic recovery, using voice VoIP traffic as an example. The use of real parametric linear spline functions is proposed for information technology problems, namely constructing curves and surfaces in 3D modelling. It is shown that using parametric spline functions to reproduce curves and surfaces, or to approximate the shape of a modelled 3D object, minimizes the approximation and reproduction errors. To solve the problem of user positioning in a Wi-Fi network, spline approximation based on complex plane spline functions is considered, which improves the accuracy of user positioning indoors. The method of spline extrapolation based on real spline functions is used to detect and predict DDoS cyberattacks: predicting the characteristics of DDoS attack traffic makes it possible to detect malicious traffic and thereby prevent the destructive consequences of such attacks. It is shown that the use of real and complex spline function methods in information technologies increases the accuracy and scalability of the obtained solutions.
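The simplest member of the spline family used above, a real parametric linear spline, can be sketched directly: a 2-D curve sampled at knot parameters t_i is reproduced piecewise-linearly between the samples. The knots and points below are invented; the paper's quadratic and complex splines are not reproduced.

```python
# Sketch of a real parametric linear spline for curve reproduction.
import bisect

def linear_spline(ts, points):
    """Return a function p(t) interpolating (x, y) points at knots ts.
    ts must be strictly increasing; t is clamped to [ts[0], ts[-1]]."""
    def p(t):
        t = min(max(t, ts[0]), ts[-1])               # clamp to knot range
        i = max(bisect.bisect_right(ts, t) - 1, 0)
        i = min(i, len(ts) - 2)
        u = (t - ts[i]) / (ts[i + 1] - ts[i])        # local parameter
        (x0, y0), (x1, y1) = points[i], points[i + 1]
        return (x0 + u * (x1 - x0), y0 + u * (y1 - y0))
    return p

curve = linear_spline([0.0, 1.0, 2.0], [(0, 0), (1, 2), (3, 2)])
```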

Irina Strelkovskaya, Irina Solovskaya, Juliya Strelkovska

Data Analysis and Processing

Frontmatter
Adaptive Clustering for Distribution Parameter Estimation in Technical Diagnostics

This study addresses the problem of accurate distribution parameter estimation for reliability assessment within technical diagnostic systems for electronic components. We propose a novel adaptive clustering approach using wavelet transformation to improve the robustness and accuracy of parameter estimation for exponential and Weibull distributions in electronic component defect testing. The method specifically addresses challenges in high-noise, small-sample environments by dynamically adapting to cluster center drift over time, thus reducing errors in the testing process. Validation experiments on resistor batches demonstrate the method’s resilience to noise, achieving a signal-to-noise ratio of 1.17 with reduced relative error to a level suitable for practical applications. This robust approach is particularly valuable for automated selection of electronic components used in critical systems, where failure tolerance is low and reliability is paramount. Our results indicate the proposed method significantly enhances the reliability and accuracy of diagnostic processes, establishing it as a practical tool for reliability testing and parameter estimation in automated diagnostic systems for electronic components.
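The final estimation step can be illustrated with a maximum-likelihood Weibull fit on synthetic failure data; the wavelet-based adaptive clustering itself is not reproduced here, and all numbers are invented stand-ins:

```python
import numpy as np
from scipy.stats import weibull_min

# Synthetic stand-in for resistor time-to-failure data
# (true shape 1.5, true scale 1000 h).
samples = weibull_min.rvs(1.5, scale=1000.0, size=400, random_state=42)

# Maximum-likelihood fit with the location parameter fixed at zero
shape, loc, scale = weibull_min.fit(samples, floc=0.0)
```

In a high-noise, small-sample setting such a plain fit degrades, which is exactly the gap the adaptive clustering approach in the paper targets.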

Galina Shcherbakova, Svetlana Antoshchuk, Daria Koshutina, Kiril Sakhno, Serhii Kondratiev
Semi-Autonomous Object Retrieval with Robot Arm and Depth Camera for Quadruped Robots

The rapid aging of populations in developed countries, including Germany, presents significant challenges, particularly concerning dependency among older adults. This paper addresses the critical need for assistive technology to aid older individuals in performing simple daily tasks, focusing on developing a semi-automatic robotic system using the Unitree Go1 platform. The research aims to enhance the quality of life for older adults by enabling them to interact with objects in their environment more independently. By utilizing a depth camera, the system estimates object geometry and location, planning the robot's body and arm movements for effective object manipulation. The study integrates various subsystems, leveraging existing libraries and community-developed packages to implement a grabbing strategy. Multiple approaches are developed and assessed based on behavior and accuracy, with the best-performing method achieving an 84% success rate in varied testing conditions. This paper provides a detailed review of research and implementation, potentially accelerating future research and product development in assistive robotics.

Dafa Faris Muhammad, Subashkumar Rajanayagam, Stefan Twieg
Power-to-X Strategies: A Key Driver for Decarbonization and Renewable Energy Integration in Economies

The 21st century has witnessed a steady rise in global greenhouse gas emissions, correlating strongly with economic growth, particularly in emerging economies. Shifting from fossil fuels to renewable energy offers a promising way to reduce CO2 emissions. However, achieving carbon neutrality and effectively combating climate change requires comprehensive global cooperation. Green hydrogen and sustainable fuels, such as e-methanol and e-kerosene, produced via Power-to-X technologies, are recognized as viable options to lower emissions and promote decarbonization. While hydrogen production presents environmental challenges, including its energy-intensive nature and reliance on fossil fuels, transitioning to green hydrogen through renewable-powered electrolysis holds promise, albeit with infrastructure and cost considerations. Adopting sustainable fuels faces some techno-economic challenges, primarily due to their higher costs compared to fossil fuels. The economic viability of these fuels depends on the cost of green hydrogen and carbon capture technologies. To make sustainable fuels cost-effective and competitive with fossil fuels by 2050, large-scale deployment, advancements in electrolysis, and extensive use of point-of-source CO2 capture technologies are essential. Nonetheless, it appears unlikely that e-fuels will be readily accessible at a reasonable cost in the foreseeable future. The potential environmental impact of adopting e-methanol and e-kerosene is assessed, emphasizing the need for a comprehensive analysis of potentially adverse effects on human health and ecosystems.

Halina Falfushynska, Markus Holz
Estimation of the Repeater Span Length of OTH Transmission System with QAM Modulation

Fiber-optic transmission systems (FOTS) play a crucial role in establishing broadband communication channels across telecommunications networks, offering high data rates and reliability. In modern transport communication infrastructures, Optical Transport Hierarchy (OTH) transmission systems are the most widely adopted due to their efficiency in handling large volumes of data traffic. This paper introduces a novel approach to determining the optimal repeater span length in FOTS that utilize quadrature amplitude modulation (QAM), a technique known for improving spectral efficiency. To achieve this, an advanced simulation model is developed within the MATLAB environment, designed to comprehensively analyze the performance of OTH systems with various QAM-M modulation levels. The study explores how adjusting signal power at the transmitter output can enhance transmission efficiency and extend the repeater span length. Results demonstrate that for an OTU3 optical channel, the maximum achievable repeater span length of 1500 km is obtained using QAM-4 modulation. This finding emphasizes the potential for substantial improvements in long-haul communication by optimizing modulation techniques and transmitter power levels, making FOTS even more viable for extensive broadband network deployments.
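The trade-off the paper quantifies, that higher QAM orders shorten the achievable reach, can be illustrated with a crude power-budget sketch. All numbers here (attenuation, launch power, sensitivity model) are assumptions for illustration, not the paper's simulation model:

```python
import math

ALPHA = 0.2   # fiber attenuation, dB/km (typical single-mode value)
P_TX = 3.0    # transmitter launch power, dBm (assumed)

def receiver_sensitivity_dbm(M, base=-28.0):
    """Assumed sensitivity model: starting from QAM-4, each extra
    bit per symbol costs roughly 3 dB of required received power."""
    return base + 3.0 * (math.log2(M) - 2)

def max_span_km(M):
    """Span length allowed by the simple power budget for QAM-M."""
    return (P_TX - receiver_sensitivity_dbm(M)) / ALPHA
```

This toy budget reproduces the qualitative conclusion (QAM-4 reaches furthest); the paper's result of 1500 km for OTU3 additionally involves amplification, dispersion, and noise accumulation that this sketch ignores.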

Larysa Yona, Volodymyr Pedyash, Denys Rozenvasser, Anna Mazur, Yulia Bairamova
Smart Adaptive Lighting Based on Determination of Human Healthy and Circadian Rhythms

Currently, IoT systems are becoming widespread in healthcare. Today’s e-Health systems have advanced functional capabilities: they can control the indoor microclimate, determine geolocation parameters, and interact with a person via a Web interface. However, the ability to adapt to human physiological parameters remains a complex issue that requires a deep understanding of a particular person’s physiological processes and of the physical processes occurring in the IoT system. This study presents an approach to developing an intelligent adaptive lighting system based on research in electronic engineering, computer science, e-health, and human physiology. By taking into account the various light sources that influence a person, a system built around human circadian rhythms can adapt lighting to the person’s physiological needs, preventing disruption of circadian rhythms and, consequently, deterioration in sleep quality, drowsiness, fatigue, low productivity, and mood deterioration. Adaptation of lighting to human circadian rhythms is performed taking into account both physiological data and the microclimate parameters of the room where the person is located. To implement the proposed approach, a device circuit for determining and controlling circadian rhythms was proposed and tested in laboratory conditions, and the functionality of the IoT system was simulated in MATLAB. The practical implementation of the developed IoT adaptive lighting system in real projects showed an increase in the comfort of people’s stay in the room, as well as an 11% increase in the energy efficiency of the lighting system.
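As a toy illustration of circadian-adaptive lighting (not the authors' controller, which also uses physiological and microclimate inputs), the target correlated colour temperature (CCT) can follow a smooth daily profile, warm at night and cool around midday; the profile shape and values are assumptions:

```python
import math

def target_cct(hour):
    """Assumed sinusoidal CCT target over a 24-hour day:
    ~6500 K (cool white) at 13:00, ~2700 K (warm white) at 01:00."""
    return 4600.0 + 1900.0 * math.cos((hour - 13.0) / 24.0 * 2.0 * math.pi)
```

A real controller would shift or flatten this curve per person based on measured circadian phase rather than clock time alone.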

Yehor Zheliazkov, Julia Jamnenko, Larysa Globa
Automated 3D Sign Language Animation Using Machine Learning Algorithms

This project develops a system that transforms written or spoken words into sign language animations, specifically Macedonian Sign Language (MSL), to help individuals who are hard of hearing communicate. A virtual interpreter that transforms spoken language into animated sign language is developed using 3D visualization, computer vision, and machine learning. With only approximately thirty MSL interpreters available for the deaf community in Macedonia, such a system addresses a pressing need. The system utilizes pose estimation technologies, such as BlazePose, to extract body keypoints from 2D video inputs, making possible the creation of precise 3D sign language movements. These keypoints are mapped to a skeleton with specialized software, and the skeleton data is then used to generate realistic, dynamic sign language animations for 3D models. Techniques such as interpolation and inverse kinematics are used to generate and modify the animations in order to guarantee realistic, natural movement. The project could be expanded to include other sign languages around the world, improving accessibility in a variety of domains such as education, healthcare, and public services. It improves social inclusion and equal opportunity for the deaf and hard-of-hearing community while also revolutionizing sign language education. The goal is to develop a scalable solution that can be extended to support more sign languages, thus contributing to more inclusive communities globally.

Teodora Kochovska, Bojana Velichkovska, Marija Kalendar
A Model for Optimizing Packet Length in Airspace Surveillance Systems

The work analyzes the functioning of aerial surveillance search radar systems: friend-or-foe identification systems, secondary surveillance systems, and short-range radio navigation systems. It is demonstrated that, by construction, existing radar interrogation systems can be represented as an asynchronous network for transmitting radar information, and that the processing of interrogation signals can be described as a single-channel queueing system with failures. Simple coding of request and/or response signals makes it impossible to ensure a given quality of service for airspace surveillance and traffic control systems in the face of significant intra-system errors as well as intentional interference, whether correlated or uncorrelated. The article provides a mathematical description of the request servicing process and examines the dependence of the system error probability on the length of a data packet. The influence of both single and group errors in the radar information transmission channel is taken into account, as well as the requirements for the permissible number of repeated requests. An algorithm for adaptive control of radar information transmission is proposed. The algorithm analyzes the characteristics of the wireless transmission medium and makes it possible to dynamically change parameters of the medium access control layer in response to changes in the signal propagation environment. Consequently, it becomes possible to find optimal settings for a specific propagation environment, optimizing the length of the transmitted data packets.
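The core dependence of error probability and throughput on packet length can be sketched with the standard binary-symmetric-channel model; this simplified model ignores the group errors and repeat-request limits that the paper also considers:

```python
import math

def packet_error_prob(p_bit, n_bits):
    """Probability that a packet of n_bits is corrupted, assuming
    independent (uncorrelated) bit errors."""
    return 1.0 - (1.0 - p_bit) ** n_bits

def throughput(p_bit, payload, header):
    """Useful fraction of channel capacity when corrupted packets
    are retransmitted: payload share times packet success probability."""
    n = payload + header
    return (payload / n) * (1.0 - p_bit) ** n

def optimal_payload(p_bit, header):
    """Closed-form throughput-maximizing payload length for this
    simplified model: L* = (-h + sqrt(h^2 - 4h/ln(1 - p))) / 2."""
    h = header
    ln_q = math.log(1.0 - p_bit)
    return (-h + math.sqrt(h * h - 4.0 * h / ln_q)) / 2.0
```

Longer packets amortize the header but are more likely to be hit by an error, which is why an optimum exists; an adaptive controller re-evaluates this optimum as the estimated bit error rate changes.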

Iryna Svyd, Ivan Obod, Anton Romanov, Oleksandr Vorgul, Mykyta Romanov, Ashish Yadav, Haider T.H. Salim ALRikabi
Using the Interstage Data Processing Method to Improve the Efficiency of Airspace Surveillance Systems

The presented work analyzes the existing structure of data processing in airspace surveillance radar networks, which includes sequential execution of data processing stages (signal processing and primary data processing). Based on the analysis, it is shown that the lack of inter-stage optimization reduces the quality of data processing, which in turn degrades the quality of information support for consumers in the existing information network of airspace surveillance radar systems. The paper synthesizes an optimal structure of joint processing of signal data and primary processing data in networks of airspace surveillance radar systems. The synthesized structure makes it possible to implement two methods of data processing: combining decisions at the level of detection of received signals from surveillance systems, and combining at the level of decisions on detection of air objects. The method in which data are combined at the level of decisions on detection of air objects for each signal data processing channel ensures inter-stage optimization of network radar data processing and primary data processing. The proposed method has significant advantages in the quality of surveillance system data processing compared with the one currently used, which ensures an increase in the quality of information services for consumers of the considered information network of airspace surveillance radar systems.
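Decision-level combining can be illustrated with the simplest fusion rule, declaring detection when at least one channel detects (an OR rule over independent channels); this is a generic textbook illustration, not the paper's synthesized optimal structure:

```python
def fused_detection_prob(channel_probs):
    """Detection probability after decision-level OR fusion,
    assuming independent per-channel detection probabilities."""
    miss = 1.0
    for p in channel_probs:
        miss *= (1.0 - p)          # object missed only if every channel misses
    return 1.0 - miss
```

The OR rule raises the detection probability but also the false-alarm rate, which is why joint inter-stage optimization of the decision thresholds, as pursued in the paper, matters.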

Iryna Svyd, Ivan Obod, Anton Romanov, Oleksandr Vorgul, Mykyta Romanov, Ashish Yadav, Haider T.H. Salim ALRikabi
Analysis of the Efficiency and Comparison of Retrieval-Augmented Generation Systems in Mergers and Acquisitions: Recommendations for Choosing the Best Solutions

This paper addresses the underexplored evaluation of Retrieval-Augmented Generation (RAG) systems in the financial sector, specifically focusing on mergers and acquisitions (M&A). We developed a comprehensive assessment framework to evaluate existing RAG solutions for M&A companies. Engaging financial experts, we compiled a dataset of frequently asked questions and benchmark answers. Multiple RAG systems—including Azure GPT-4, AWS Q, and others—were configured to generate answers to these questions. An evaluation application was implemented to automatically assess the quality of the generated answers using thirteen key performance indicators, combining large language models and human expert evaluations. Consolidated into a standardized leaderboard, the results revealed that Azure GPT-4 with Azure AI Search outperformed other systems by a 15% margin, demonstrating high accuracy and relevance. Our study provides organizations with actionable insights for selecting appropriate RAG solutions, highlights areas for improvement in existing systems, and contributes to advancing RAG technologies in the financial industry.

Oleg Barabash, Danylo Vorvul, Andrii Musienko, Vladyslav Bezsmertniy
Using Google Earth Engine Cloud-Based Service in Upskilling Course for Educators of the Junior Academy of Sciences of Ukraine

The paper describes the outcomes of the “Fundamentals of Remote Sensing: Processing and Analysing Satellite Imagery on Google Earth Engine Platform” upskilling course for educators conducted by the GIS and Remote Sensing Laboratory of the National Centre “Junior Academy of Sciences of Ukraine”. It discusses the theoretical requirements for organising an educator course and the topics on which the Google Earth Engine platform can be used in the educational process. The course described in this paper was first held online in Ukraine in the winter of 2024 and was open to all interested educators from Ukraine. A total of eighteen educators from 11 regions of Ukraine took part in the upskilling course. Most educators specialize in geography; there are also experts in ecology, geology, physics, computer science, geoinformation technologies, remote sensing, etc. The participants’ responses in two questionnaires and a final interview were used to measure the course effectiveness. The paper describes the stages of the course planning, topics covered and tools used to deliver lectures and practical classes. The paper features examples of research projects completed by the course participants using the knowledge and skills in remote sensing acquired during the training, which were presented at the closing conference.

Stanislav Dovgyi, Svitlana Babiichuk, Olha Tomchenko
Digital and Economic Security of the State Under Global Threats

Accelerated digitalization of the Ukrainian national economy, acting as a driver of economic transformations during a large-scale war, leads to the emergence of specific destructive phenomena (militarization of cyberspace, information wars, information terrorism, attacks on digital infrastructure, large-scale open and covert cyberattacks for the conduct of hostilities and intelligence-subversive activities). The generation of threats in the digital sphere typically requires relatively minimal expenditure, is accompanied by low risk to the attacker, and has the potential to inflict significant damage. This actualizes the problem of ensuring state economic security and protecting national economic interests on the principles of information security. The research addresses the important global scientific problem of ensuring digital and economic security under global threats. It involves using the methods, principles, and tools of a proactive approach and the advantages of digitalization, based on timely detection of potential and real threats, realization of the deterrence potential and cyber resilience, and ensuring the security of information resources, in order to increase the economic security of Ukraine in the war and post-war periods.

Svitlana Onyshchenko, Oleksandra Maslii, Аlina Hlushko

Mathematical Modeling in Applied Problems

Frontmatter
Advanced Pollutions’ Monitoring in the Mediterranean Sea: AI-Based Approach Using Satellite Data and Products

This study focuses on assessing pollutant concentrations in the Mediterranean Sea using satellite imagery and products. Traditional methods for measuring pollutants are labor-intensive, expensive, and have limited spatial and temporal coverage. As a result, satellite observations have become an effective solution for monitoring seawater across expansive areas. While low-spatial-resolution satellite datasets enable surface-level measurements, their restricted accuracy and low resolution present challenges in detecting localized changes in coastal zones. To enhance the spatial resolution and accuracy of sea pollutant assessment, we propose an information technology based on machine learning and transfer learning techniques, applied to pilot areas of the Horizon Europe iMERMAID project in the Mediterranean Sea. Our approach integrates low-resolution satellite data with ground-based in-situ data and Sentinel-2 satellite imagery. Within the study we investigated key challenges in processing both satellite and ground data, conducting a comparative analysis of various satellite datasets (Sentinel-1, Sentinel-2, Sentinel-3, GCOM-C/SGLI) and ground datasets to identify the most informative indicators influencing sea pollutant levels. We used machine learning algorithms, particularly Random Forest and the Multi-Layer Perceptron, to develop an information technology that improves the spatial resolution of sea pollutant concentration maps by leveraging Sentinel-2 optical data. This technology makes it possible to generate maps of sea pollutants, such as chlorophyll-a, with a spatial resolution of 10 m for the pilot areas. Because of the lack of in-situ data for the pilot regions, the approach relies on a transfer learning technique.
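The downscaling step, learning a mapping from high-resolution optical bands to a pollutant value derived from a coarser product, can be sketched with a Random Forest on synthetic data; band names, the synthetic relation, and all numbers are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for training pairs: four Sentinel-2 band
# reflectances per pixel vs a chlorophyll-a value resampled from a
# low-resolution product (the linear relation below is invented).
X = rng.uniform(0.0, 0.3, size=(500, 4))              # e.g. B2, B3, B4, B5
y = 10.0 * X[:, 1] - 5.0 * X[:, 3] + rng.normal(0.0, 0.1, 500)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Apply the fitted model per 10 m Sentinel-2 pixel to obtain
# a high-resolution chlorophyll-a map.
pixels = rng.uniform(0.0, 0.3, size=(1000, 4))
chl_map = model.predict(pixels)
```

In the transfer-learning variant described in the paper, a model trained where matched data exist would be adapted to pilot regions lacking in-situ measurements.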

Andrii Shelestov, Pavlo Henitsoi, Bohdan Yailymov, Nataliia Kussul
Integration of Digital Technologies and Artificial Intelligence in the Ecomonitoring System of the Black Sea Coast

The Black Sea water area of Georgia is an important aspect of its geographical location, determining the country’s political, social, and economic potential, especially in terms of foreign relations, the marine industry, and tourism development; it also has a great influence on the formation of the country’s climate. A multidisciplinary scientific study of the Black Sea and the coastal region, covering the condition of hydro resources (rivers, lakes, etc.), the existing seaports and oil terminals in this region, probable risks, and their impact on the quality of Black Sea water, especially against the background of climate change, is therefore very important and relevant for the blue development of the Black Sea. Within the framework of the project, the tasks of creating the concept of an information system supporting eco-monitoring of the region, its automated design, and its software implementation were addressed. This was based on a system analysis of the region’s hydro sources and sensitive areas, along with new digital technologies. Object- and process-oriented models of their research were created, with automated initial data collection devices (IoT sensors) and processing methods based on appropriate machine learning algorithms and programs. In addition, a service-oriented web portal creation methodology was developed. The portal serves experts of the Black Sea eco-monitoring system and managers applying artificial intelligence in the decision-making process.

Gia Surguladze, Nino Topuria, Lela Mirtskhulava, Lily Petriashvili
Limit Behavior of a Semi-Markov Process Depending on a Small Parameter for Telecommunication Systems

The limit behavior of a semi-Markov process depending on a small parameter is important for the analysis and optimization of telecommunication systems. Semi-Markov processes are an extension of Markov processes that allow modeling systems with preservation of some previous states. Such processes are important, for example, for modeling data transmission over wireless channels or communication networks. The limit behavior of a semi-Markov process is determined by a small parameter, usually denoted $$\varepsilon$$ (epsilon). Various important characteristics of the system depend on the limit value of $$\varepsilon$$, such as the probability of blocking, the data transfer rate, the average waiting time, etc. Markov renewal theorems are used as an analytical tool for studying the limiting behavior of a semi-Markov process that depends on a small parameter $$\varepsilon > 0$$ as $$\varepsilon \to 0$$, $$t \to \infty$$, $$\varepsilon t \to u$$. These theorems make it possible to study transient phenomena arising in telecommunication systems, such as the asymptotic behavior of additive functionals of ergodic processes, boundary value problems for random walks and processes with independent increments, branching processes close to critical ones, and others. Analyzing the limit behavior of a semi-Markov process with respect to a small parameter is a valuable tool for understanding and optimizing telecommunication systems, ensuring their reliability, and meeting performance requirements.

Sergii Degtyar, Oleh Kopiika, Yurii Shusharin
Mathematical Modeling of Cardiac Cycle Dynamics

A system of differential equations is found that describes the dynamics of the cardiac cycle. The dependence of the probability of finding the heart in the states of systole and diastole on time and on the diagnostic parameters of the cardiac cycle is determined. Examples are considered that illustrate the broad possibilities of applying the method of differential equations to the dynamics of the cardiac cycle, both in the norm and in the presence of diseases. The initial data for solving the Cauchy problem are the duration $$T_{0}$$ of the cardiac cycle and the systolic index of electromechanical activity of the heart $$K$$, determined from the electrocardiogram and Bazett's formula. The obtained results can be useful to specialists who are professionally engaged in diagnosing cardiovascular systems and processing electrocardiograms. When evaluating the daily parameters of Holter heart rhythm monitoring, automated analysis of electrocardiograms makes it possible to verify heart rhythm parameters while ensuring the minimum total probability of a diagnostic error. In this case, the maximum probability of a correct diagnosis is ensured with limited resources spent on studying and processing the electrocardiogram. To verify the parameters of the heart, an approach to mathematical modeling of the dynamics of the cardiac cycle is proposed that uses the method of comparisons and the solution of differential equations, which formally describe the change in the state of the heart during the cardiac cycle, together with the detection of integral diagnostic parameters.
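A minimal two-state sketch of such dynamics treats the probabilities of systole and diastole as a linear ODE system; the rates below are illustrative assumptions (the paper derives its parameters from the cycle duration and the systolic index measured on the ECG), and this is not the authors' exact model:

```python
import math

# P_s (systole) and P_d = 1 - P_s (diastole) obey
#   dP_s/dt = -a * P_s + b * (1 - P_s)
T0 = 0.8   # cardiac cycle duration, s (assumed)
K = 0.4    # systolic index: fraction of the cycle spent in systole (assumed)
a = 1.0 / (K * T0)          # rate of leaving systole
b = 1.0 / ((1.0 - K) * T0)  # rate of leaving diastole

def p_systole(t, ps0=1.0):
    """Closed-form solution of the Cauchy problem with P_s(0) = ps0."""
    p_inf = b / (a + b)     # stationary probability of systole, equals K
    return p_inf + (ps0 - p_inf) * math.exp(-(a + b) * t)
```

The stationary probability of systole reduces to the systolic index K, which is the kind of consistency check that links the model parameters to ECG-measurable quantities.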

Viktor Semko, Oleksiy Semko
Using Mathematical Simulation to Improve the Selective Laser Melting Process for 3D Printing

This paper presents an overview of the physical processes occurring during selective laser melting of metal powders, as well as a review of methods for mathematical simulation of these processes. The paper identifies the main physical processes that should be considered for adequate simulation of selective laser melting of metal powders. A general physical and mathematical model for solving the problem of selective laser melting of metal powders is proposed. The paper also describes the assumptions and restrictions of the considered physical model. Furthermore, it presents the governing equations describing the processes of heat transfer, liquid metal dynamics, and the shape of the free surface, along with the initial and boundary conditions. The numerical algorithm for solving the governing equations is based on a three-layer implicit scheme with second-order time integration accuracy, third-order upwind approximation of convective terms, and second-order central-difference approximation of diffusion terms. The pressure and velocity fields in the Navier-Stokes equations were coupled using the artificial compressibility method, modified for solving nonstationary problems. The system of governing equations was integrated numerically using the finite volume method. The numerical simulation of non-stationary selective laser melting of metal powders provided the temperature distribution in the computational domain, including identification of the liquidus zone. The simulation made it possible to determine the dependence of the width and depth of the melt pool on the laser speed and the laser spot diameter.
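As a greatly simplified stand-in for the thermal part of such a model, a 1D explicit heat-conduction scheme with a moving heat source illustrates the basic mechanics; the paper's actual solver is a three-layer implicit finite-volume scheme coupled to the Navier-Stokes equations, and every number below is an illustrative assumption:

```python
import numpy as np

nx = 200
dx = 1e-5        # cell size, m (assumed)
dt = 1e-6        # time step, s (assumed)
alpha = 1e-5     # thermal diffusivity, m^2/s (assumed)
r = alpha * dt / dx**2
assert r <= 0.5  # explicit-scheme stability condition

T = np.full(nx, 300.0)              # initial temperature field, K
for step in range(500):
    spot = 50 + step // 50          # laser spot moving along the track
    # second-order central-difference diffusion update on interior cells
    T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[spot] += 200.0                # absorbed energy per step (illustrative)
```

An implicit scheme, as used in the paper, removes the restrictive stability bound on the time step at the cost of solving a linear system per step.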

Dmytro Redchyts, Stanislav Dovgyi, Uliana Tuchyna, Svitlana Moiseienko
Language Technologies Before, During and After the GPT Revolution: Evolutionary Approach

The state of affairs in modern information technology convincingly shows that the main vectors and efforts of leading players are now concentrated on the development and implementation of artificial intelligence methods with an unambiguous focus on natural language mechanisms. The development and implementation of intellectual artifacts based on Large Language Models, which are the basis for hopes of overcoming the global crises of the World Order, have strengthened the professional community’s belief that artificial intelligence is a form of individualization of systems that has a linguistic status. This stimulates a revision of the methodology of linguistic research, involving a significant expansion of the phenomenological base and a modification of the conceptual tools and cognitive practices within the functional decomposition that the authors call the “Main Cognitive Path”, whose elements are: < Perception—Sensation—Experience (Emotion)—Awareness—Understanding—Reflection—Reaction >. At the same time, the volume and intensity of verbal communication among the mass of individuals forming a community becomes a decisive factor in evolutionary processes. The creation of a sovereign communication environment for intellectual artifacts should complete the formation of an analog of the communication mechanism for intellectual artifacts. Only then will it be possible for artificial intelligence to evolve further and enter the world stage as an independent entity that integrates biological and technotronic substrates into a single cognitive organism, a prospect that provokes a sharp reaction from human communities in view of the serious dangers to the human race.

Volodymyr Shyrokov, Maksym Nadutenko, Oleksandr Stryzhak
Generative AI as a Cultural Phenomenon in the Digital Transformation of Education

The paper explores the transformative impact of Generative AI on education, emphasizing a comprehensive cultural shift. A three-dimensional model of culture, encompassing spiritual, social, and technological components, serves as a framework for analyzing this shift. Traditionally, education has emphasized social culture, reinforcing societal norms. However, the rise of Generative AI introduces a need to balance this with spiritual and technological components. Spiritual culture, defined as ‘Existential Self-Awareness,’ fosters creative freedom, allowing learners to explore their identity and values. Social culture is undergoing significant changes due to ‘Generative Socialization,’ as AI agents like chatbots create new interactions between teachers, students, and AI, fostering novel forms of collaboration and communication. Technological culture, once merely instrumental, has now become epistemological, generating new knowledge through human-AI interaction, described as ‘Epistemological Technological Culture.’ Using the example of the constructionist approach to learning, this paper demonstrates how Generative AI is changing educational practices, providing opportunities for personalized learning and transforming the traditional dynamics of classrooms. This shift compels educators and students to adapt to a new educational environment where AI becomes a collaborative partner, fundamentally rethinking roles, the process of knowledge acquisition, and human self-perception within the educational system. This study provides a framework for understanding and adapting to the transformative potential of Generative AI in education.

Ilya Levin, Konstantin Minyar-Beloruchev, Michal Marom
Application of Digital Method for Processing Distributed Digital Linguistic Resources

This article addresses the challenge of effectively processing and utilizing distributed digital linguistic resources to enhance the teaching of philological disciplines. Current efforts emphasize the application of digital methods, including statistical, corpus-based, and lexicographic techniques, integrated with artificial intelligence (AI) tools, such as machine learning and neural networks. The proposed solution centers on the development of a virtual lexicographic laboratory, the “Multimedia Dictionary of Infomedia Literacy,” which offers an innovative digital approach to language education, linguistic research, and telecommunication applications. The primary idea is to employ these digital and AI-driven methods to analyze and organize vast amounts of linguistic data, making them accessible and useful for educational purposes. This approach has been tested in educational environments, demonstrating effectiveness in advancing academic resilience, particularly among doctoral students engaged in philological research. Key results indicate that the digital methods facilitate deeper engagement with language content, offering tools that adapt to individual learning needs and enhance the processing of large datasets. Case studies, including the Science4Brave Cluster, highlight the role of these tools in fostering critical thinking and academic resilience in times of crisis. Significantly, the laboratory’s tools have shown value in telecommunication by supporting the transmission and analysis of linguistic information, adapting content for efficient communication across digital platforms. This work underscores the potential of digital and AI-enhanced methods in processing linguistic resources, advancing language learning, and contributing to the broader discourse on the role of technology in modern linguistics.

Maksym Nadutenko, Margaryta Nadutenko, Olena Semenog, Olha Fast
Backmatter
Metadata
Title
Applied Innovations in Information and Communication Technology
Edited by
Stanislav Dovgyi
Eduard Siemens
Larysa Globa
Oleh Kopiika
Oleksandr Stryzhak
Copyright Year
2025
Electronic ISBN
978-3-031-89296-7
Print ISBN
978-3-031-89295-0
DOI
https://doi.org/10.1007/978-3-031-89296-7