
2022 | Book

ICTERI 2021 Workshops

ITER, MROL, RMSEBT, TheRMIT, UNLP 2021, Kherson, Ukraine, September 28–October 2, 2021, Proceedings

Edited by: Oleksii Ignatenko, Vyacheslav Kharchenko, Vitaliy Kobets, Hennadiy Kravtsov, Yulia Tarasich, Vadim Ermolayev, David Esteban, Vitaliy Yakovyna, Aleksander Spivakovsky

Publisher: Springer International Publishing

Book series: Communications in Computer and Information Science

About this book

This book contains the workshop papers presented at the 17th International Conference on Information and Communication Technologies in Education, Research, and Industrial Applications, ICTERI 2021, held in Kherson, Ukraine, in September-October 2021.
The 33 revised full papers and 4 short papers included in this volume were carefully reviewed and selected from 105 initial submissions. The papers are organized according to the following workshops: 9th International Workshop on Information Technology in Economic Research (ITER 2021); 5th International Workshop on Methods, Resources and Technologies for Open Learning and Research (MROL 2021); International Workshop RMSEBT 2021: Rigorous Methods in Software Engineering and Blockchain Technologies; 7th International Workshop on Theory of Reliability and Markov Modeling for Information Technologies (TheRMIT 2021); 1st Ukrainian Natural Language Processing Workshop (UNLP 2021).

Table of contents

Frontmatter

9th International Workshop on Information Technology in Economic Research (ITER 2021)

Frontmatter
Automated Forming of Insurance Premium for Different Risk Attitude Investment Portfolio Using Robo-Advisor

The volume of private investment is growing steadily. It is therefore crucial to analyze investors’ behaviour, their decision-making factors, the specifics of how they form investment portfolios and, especially, the cognitive constraints that prevent them from effectively defining investment goals and profitably achieving them. This research shows that investors, even experienced and financially literate ones, often make significant mistakes when creating their own investment portfolios. Thus, the use of automated tools for determining the insurance premium and the optimal investment portfolio, such as a robo-advisor, becomes relevant. The paper presents a model for estimating a personal insurance premium for investment portfolios with different risk attitudes using a robo-advisor. Three types of investors are analyzed: conservative, aggressive, and moderately aggressive. The model helps determine the individual size of the insurance premium for each investor profile, taking into account his or her risk attitude.

Vitaliy Kobets, Valeria Yatsenko, Ihor Popovych
An Approach to Business Process Model Structuredness Analysis: Errors Detection and Cost-Saving Estimation

This paper considers business process model structuredness issues, which are mostly related to inaccurate usage of gateways. According to related work in the process model structuredness domain, split gateways ought to match respective join gateways of the same type, while the existing mismatch measure allows evaluating model structuredness only by the degrees of split and join gateways. Thus, the current measure of process model structuredness is not accurate enough, and process model shortcomings may remain undetected, which may negatively affect model understandability and maintainability and increase the error probability of business process models. As a consequence, error fixing costs may grow exponentially during later stages of the information system lifecycle. Therefore, we have proposed an improved gateway mismatch measure and a model to detect design issues and suggest the changes necessary to achieve a sufficient level of business process model structuredness. A software tool for business process model structuredness analysis was developed to perform experiments with a large set of business process models from different industries. An analysis of the obtained results, including sample business process models, detected design issues, and estimated efforts and cost-saving benefits, is outlined. Conclusions were made, and future work was formulated.

Dmytro Orlovskyi, Andrii Kopp
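
The following is a minimal illustrative sketch, not the authors' tool or their improved measure: it only shows the basic idea of a gateway mismatch score, comparing accumulated split and join degrees per gateway type in a BPMN-like model. The data structure and the summation rule are assumptions made for illustration.

```python
# Naive gateway-mismatch score for a BPMN-like model (illustrative only).
from collections import defaultdict

def gateway_mismatch(gateways):
    """gateways: list of dicts like {"type": "XOR", "kind": "split", "degree": 3}."""
    split_deg = defaultdict(int)
    join_deg = defaultdict(int)
    for g in gateways:
        if g["kind"] == "split":
            split_deg[g["type"]] += g["degree"]
        else:
            join_deg[g["type"]] += g["degree"]
    # Mismatch per type = absolute difference of accumulated split and join degrees.
    return sum(abs(split_deg[t] - join_deg[t]) for t in set(split_deg) | set(join_deg))

if __name__ == "__main__":
    model = [
        {"type": "XOR", "kind": "split", "degree": 3},
        {"type": "XOR", "kind": "join", "degree": 2},   # one XOR branch is never re-joined
        {"type": "AND", "kind": "split", "degree": 2},
        {"type": "AND", "kind": "join", "degree": 2},
    ]
    print(gateway_mismatch(model))  # -> 1, signalling a structuredness issue
```
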
From Deutschland AG to World, Inc.? Network Analysis of the Capital Linkages of German Listed Companies

This paper investigates whether the global trend of ownership concentration of international financial institutions can also be observed for the Deutschland AG, which is the informal designation for the historically grown and largely isolated network of German stock listed companies. Using network analysis, capital linkages of German HDAX and SDAX companies in both 2006 and 2018 are analyzed and the results are compared. The network analysis enables a systematic presentation of the capital linkages and also helps to analyze link strengths and make supposedly hidden relationships visible. This has been made possible in recent years by the further development of powerful IT hardware and the development of corresponding network analysis software. The results show a noticeable increase in the concentration of internationally active investment companies in the ownership structures with a simultaneous decline in the participation rates of German investors in German companies. Therefore, both the trend of ownership concentration of international financial institutions and the erosion of the Deutschland AG can be confirmed.

Jan-Hendrik Meier, Philip Schüller, Florian Behling
The Development of Investment Projects Attractiveness Assessment Software

The paper describes the creation of software for the automated assessment of projects’ investment attractiveness. The research goal is the development of software for automated assessment of the investment attractiveness of projects using investment criteria. The research methods include algorithms for assessing the investment attractiveness of projects using defined criteria and software development on the BAS ERP platform using finance criteria. To integrate these criteria, a scoring system for the selection of investment projects was implemented. A comparative analysis of software for assessing the investment effectiveness of projects was carried out, the main criteria for assessing the investment attractiveness of projects were determined, and a software tool for automated analysis of the investment attractiveness of projects was developed to evaluate different projects taking into account the defined criteria.

Vitaliy Kobets, Oleksandr Berehovyi
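
As a minimal sketch of two classic investment criteria of the kind the abstract alludes to, the snippet below computes net present value and discounted payback for one project; the cash flows and discount rate are made-up illustration values, not data from the paper.

```python
# Classic project-selection criteria (illustrative values only).

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the initial (usually negative) outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def discounted_payback(rate, cash_flows):
    """First period in which the cumulative discounted cash flow turns non-negative."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf / (1 + rate) ** t
        if cumulative >= 0:
            return t
    return None  # never pays back within the horizon

project = [-100_000, 30_000, 40_000, 45_000, 35_000]
print(round(npv(0.10, project), 2))        # positive NPV -> project passes this criterion
print(discounted_payback(0.10, project))   # pays back in period 4 at a 10% discount rate
```
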
Designing an Algorithm for Capturing Price Volatility Factors in the Stock Market

The research objective is to design an algorithm to be used to identify factors influencing the price dynamics of shares at the time of publication of quarterly reports. Subject of research: design of forecast algorithms for trade platforms for stock exchanges. Research methods used: comparative analysis, business analytics, design of a software module, correlation regression analysis, analytical methods. Results of the research: The paper contributes to the empirical studies of the influence of interim disclosures of companies on equity price fluctuation and to the practical design of algorithms for trade forecasting platforms. The key finding is that the highest incidence of price dynamics was observed during the period of quarterly reports. The largest impact on price movements during the quarterly reporting period was made by the number of open short positions and the capitalization of companies.

Liubov Pankratova, Tetiana Paientko, Yaroslav Lysenko
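
A rough sketch of the correlation-regression step named in the abstract is shown below: an OLS regression of price changes around reporting dates on short interest and capitalization. The data are synthetic stand-ins, not the paper's dataset, and statsmodels is chosen only as a convenient illustration.

```python
# Illustrative OLS regression on synthetic data (not the paper's dataset).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
short_interest = rng.uniform(0, 0.2, n)        # hypothetical share of float sold short
capitalization = rng.lognormal(10, 1, n)       # hypothetical market capitalizations
noise = rng.normal(0, 0.01, n)
price_change = -0.15 * short_interest + 0.002 * np.log(capitalization) + noise

X = sm.add_constant(np.column_stack([short_interest, np.log(capitalization)]))
model = sm.OLS(price_change, X).fit()
print(model.summary())   # coefficients show the direction and strength of each factor
```
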
The Prediction of Leadership Degree Based on Machine Learning

While the evolution of digital technologies in human-related aspects changes the approach to organisational issues, artificial intelligence enables complex decision-making and supports strategically important evaluations. Management behaviour, decisions and activities at all organisational levels cause consequences of varying degrees. Recent developments across management-related processes require a paradigm shift regarding the application of assistive technologies. Leadership as a phenomenon is multifarious. In our exploration, we limit the scope of investigation to the degree of leadership, one of the decisive components of successful entrepreneurship and one of the organisational development indicators. The purpose of our study comprised the formulation of an organisational leadership model and the simulation of leadership degree based on given parameter data, to ensure valid prediction and the associated prevention of leadership misbehaviour and faulty decisions. In this paper, we conduct a feasibility study and provide a solution for predicting the degree of leadership. First, we discuss the current situation exhibited in publications on the researched topic relying on qualitative content analysis; then we apply the method of scientific abstraction and modelling of the leadership degree. Further, we propose a holistic approach to predict the defined leadership parameters over time, based on a regression decision tree model. In order to evaluate the proposed approach, we present a selected implementation example pursuing the identified goals of analysis. Subsequently, we discuss the proposed approach with a focus on the potential benefits, obstacles, limitations and perspectives.

Olena Skrynnyk, Tetiana Vasylieva
Aligning Higher Education in Ukraine with the Demands for Data Science Workforce

Accelerated technological development in the context of the Fourth Industrial Revolution has changed the nature of competition in world markets, increasing the importance of technological opportunities as a source of competitive advantage and identifying technology as a key factor of production. Every year, digital technologies change everyday life, creating the foundations for sustainable socio-economic development. Changes resulting from the revolution in information technology signal the need for new approaches to training, particularly in Ukraine. Technologies are improving at a fairly rapid pace, but the methodological base of Ukrainian higher education institutions (HEIs) adapts to such changes rather slowly, which slows down the process of making education “smart”. In turn, graduates are not the most attractive candidates for the modern labour market. This article highlights the urgent need for extensive training in this area and offers a case of a study programme for graduating data science analysts (DSAs). The original aspect of the approach is that the master’s degree programme is designed for a social science faculty rather than for an engineering faculty, as is traditional. The need for DSAs is extremely high in the economic field and in business; however, graduates of engineering faculties, while having strong programming skills, mostly lack economic knowledge and an understanding of the laws of business. The contribution of the paper is that the proposed programme differs from existing ones available on the market but not implemented in HEIs, through its systematic adaptability to the requirements of the state standard; it also meets all the requirements of employers in the field of Data Science. The paper is mostly practical and descriptive, so the methodological base of the research consists of general scientific research methods such as the historical method, comparative analysis, methods of analysis and synthesis, the system approach and logical generalization.

Ganna Kharlamova, Andriy Stavytskyy, Olena Komendant
Implementing ERP Simulation Games in Economic Education: Ukrainian Dimension

The introduction of interactive teaching methods, such as simulation games in ERP systems, into the educational process has appeared promising in both traditional and online learning; thus, the global expertise in this sphere should be learned and shared in order to increase the competitiveness of Ukrainian education. The aim of this article is to develop models for applying simulation games in ERP systems in order to exploit multidisciplinary educational opportunities in economic educational programmes and to enable students of Ukrainian HEIs to master professional competencies in both face-to-face and online learning. This study focuses on models for applying the approach within courses related to quantitative methods of decision-making support. The implementation of the proposed idea is presented in terms of the Economic Cybernetics educational programme. Two models of ERPsim implementation in the educational process are suggested, and the advantages of each model are analyzed. Examples of the use of data from ERP systems for introducing quantitative methods of decision-making support in such courses as Operations Research in Economics and Methods of Decision Justification in Economics under Information Uncertainty are given.

Galyna Chornous, Oksana Banna, Iryna Fedorenko, Iryna Didenko
Data Mining Method Application to Grain Export and Exchange Rates Co-movement Under Incomplete Information

International commodity, financial and foreign exchange markets operate under conditions of uncertainty and incomplete or asymmetric information. Consequently, exchange rate dynamics are nonlinear, do not fit fundamental economic factors, and therefore cannot be sufficiently predictable. In addition, the export of goods depends on different factors that represent multidimensional data sets. Such complexity requires applying sophisticated analytical tools such as Data Mining to analyse the situation and make reasonable management decisions under dynamic information. The paper investigates the relationship between grain export revenues and the exchange rate. The analysis shows that the fundamental assumption of an increase in export revenues due to the devaluation of the national currency does not hold during the periods of the most significant depreciation of the hryvnia; in those periods, indicators that describe the macroeconomic situation of importing countries and world market conditions play a more significant role. Identifying the significant factors influencing export earnings in different periods helps to apply adaptive management methods at different stages of the economic cycle.

Volodymyr Shevchenko, Valeria Yatsenko
Cluster Analysis of Ukrainian Regions Regarding the Level of Investment Attractiveness in Tourism

The article describes the process and results of applying the k-means algorithm in the Loginom analytical platform to the problem of clustering the regions of Ukraine by the level of investment attractiveness in the field of tourism. The selection of tourism clusters and their ranking is a difficult data analysis task, as there is no single consolidated indicator of investment attractiveness. The conclusion about whether a particular region belongs to one of the tourist clusters is determined by a set of indicators of the volume of tourist services for different types of economic activity in the field of tourism. The Loginom system has powerful tools for cluster analysis, including EM clustering, k-means, g-means and others. The tools of statistical and visual analysis of the obtained results deserve special attention: Table, Statistics, Chart, OLAP Cube, Cluster Profiles. Clustering has made it possible to identify groups of regions that are actively developing the tourism industry (primarily Kyiv city and Odesa region) and are already of interest to tourism investors, as well as problem regions that have a low level of attractiveness for domestic and foreign tourism. It is noted that Ukraine has huge potential for the development of the tourism industry: the regions that, according to the results of the cluster analysis, fall into the problem group nevertheless have “world-class tourist pearls”. The Government of Ukraine and local authorities should pay attention to the insufficient level of development of the tourism industry and provide comprehensive support to the regions in the problem cluster, thus increasing their level of investment attractiveness.

Ganna Kharlamova, Andrii Roskladka, Nataliia Roskladka, Andriy Stavytskyy, Yuliia Zabaldina
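
A minimal sketch of the clustering step is given below. The paper uses the Loginom platform; scikit-learn is shown here purely for illustration, and the region list and indicator values are invented.

```python
# k-means over per-region tourism indicators (illustrative data and tooling).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

regions = ["Kyiv city", "Odesa", "Lviv", "Kherson", "Poltava", "Zakarpattia"]
# Columns: hypothetical volumes of tourist services by activity type.
indicators = np.array([
    [950, 700, 820],
    [610, 540, 480],
    [580, 500, 470],
    [120,  90, 150],
    [100,  80, 110],
    [210, 170, 190],
])

X = StandardScaler().fit_transform(indicators)          # put indicators on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X)
for region, label in zip(regions, labels):
    print(f"{region}: cluster {label}")
```
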
An Intelligent Method for Forming the Advertising Content of Higher Education Institutions Based on Semantic Analysis

Advertising is a unique socio-cultural phenomenon: its formation is driven by social, psychological and linguistic factors, by features of the “aesthetic consciousness” of society and by its cultural traditions. An advertising text is a special kind of text: it is a carrier and an expression of information. It is important that the text is interesting to the target audience, and it can be formed by methods of semantic text analysis that highlight the keywords on the basis of which advertising content can be generated. In this regard, an intelligent method for forming the advertising content of higher education institutions based on semantic analysis is developed, with which an advertising manager can generate advertising content. The method was implemented on the basis of a survey of “Computer Science” students regarding admission. Semantic analysis of the documents based on the LSA and LDA methods is performed. The results show that, based on the LSA method, more than six keywords are present in document 0 (66%); based on the LDA method, the vast majority of keywords are present in document 2 (82%). Based on the obtained keywords, the LSA and LDA methods were used to create content for advertising higher education institutions. The effectiveness of the advertising content generated with the LSA and LDA methods was compared in an experiment conducted on Facebook on the business page “Computer Science of ZUNU”. According to the comparison, the effectiveness of the ad increased by 44% and the price per result decreased by 31%.

Khrystyna Lipianina-Honcharenko, Taras Lendiuk, Anatoliy Sachenko, Oleksandr Osolinskyi, Diana Zahorodnia, Myroslav Komar
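
A rough sketch of extracting topic keywords with LSA (truncated SVD) and LDA follows; the tiny English documents stand in for the Ukrainian survey answers used in the paper, and the term counts are purely illustrative.

```python
# Keyword extraction with LSA and LDA on toy documents (illustrative only).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD, LatentDirichletAllocation

docs = [
    "computer science admission programming mathematics career",
    "study programming algorithms practice career opportunities",
    "admission exams mathematics university computer science",
]

vec = CountVectorizer()
X = vec.fit_transform(docs)
terms = vec.get_feature_names_out()

def top_terms(components, k=5):
    """Return the k highest-weighted terms of each topic/component."""
    return [[terms[i] for i in comp.argsort()[::-1][:k]] for comp in components]

lsa = TruncatedSVD(n_components=2, random_state=0).fit(X)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

print("LSA keywords:", top_terms(lsa.components_))
print("LDA keywords:", top_terms(lda.components_))
```
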
Correlational and Non-extensive Nature of Carbon Dioxide Pricing Market

In this paper, for the first time, an analysis of the correlational and non-extensive properties of the CO2 emission market is performed, relying on the carbon emissions futures time series for the period 04.07.2008–10.05.2021 and on the daily data of the power sector from the U.S. Carbon Monitor for the period 01.01.2019–10.05.2021, which comprise data for both individual countries (USA, Germany, China, India, United Kingdom, et al.) and global emissions (World). To demonstrate the applicability of these methods to systems of a different nature and complexity, an analysis of the Dow Jones Industrial Average (DJIA) index is presented. The results show that both the futures and the DJIA are non-extensive, and the distribution of their normalized returns can be better described by power-law probability distributions, in particular by the q-Gaussian. The Tsallis triplet for the entire time series of CO2 emissions futures and the DJIA is estimated, and the q-triplet is presented as an indicator of crisis phenomena, relying on a sliding-window algorithm. It can be seen that the triplet behaves characteristically during economic crises. This study also shows that the toolkit of random matrix theory (RMT) allows investigating the correlational nature of the carbon emissions market and building appropriate indicators of crisis phenomena, which clearly reflect the collective dynamics of the entire research base during events of this kind.

Andrii O. Bielinskyi, Andriy V. Matviychuk, Oleksandr A. Serdyuk, Serhiy O. Semerikov, Victoria V. Solovieva, Vladimir N. Soloviev
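
As an illustrative sketch of the distributional claim in the abstract, the snippet below fits a q-Gaussian to the histogram of normalized returns; the series is simulated heavy-tailed data, not the CO2 futures, and the parametrization is only one common form of the q-Gaussian.

```python
# Fitting a q-Gaussian to normalized returns of a synthetic heavy-tailed series.
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, q, beta, a):
    """Unnormalized q-Gaussian: a * [1 - (1 - q) * beta * x**2]_+ ** (1 / (1 - q))."""
    base = np.clip(1.0 - (1.0 - q) * beta * x**2, 1e-12, None)
    return a * base ** (1.0 / (1.0 - q))

rng = np.random.default_rng(1)
returns = rng.standard_t(df=3, size=20_000)            # heavy-tailed stand-in for returns
returns = (returns - returns.mean()) / returns.std()   # normalization step

hist, edges = np.histogram(returns, bins=100, range=(-6, 6), density=True)
centers = (edges[:-1] + edges[1:]) / 2

params, _ = curve_fit(q_gaussian, centers, hist, p0=(1.4, 1.0, 0.5), maxfev=10_000)
print(f"estimated q = {params[0]:.2f}")  # q > 1 indicates non-extensive, fat-tailed behaviour
```
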
Developing an Algorithm for the Management of Local Government Expenditures

The research objective is to develop an algorithm to computerise the process of allocating the limited resources of a local government so as to maximise the satisfaction of the community’s needs. With limited financial resources, local governments must determine the optimum volume of planned services to be provided. The increasing amount of information, as well as the need for rapid management decision-making, necessitates the use of information and computer technologies (ICT) in this area. Research methods used: comparative analysis, planning theory, utility analysis, design of a software module, analytical methods. Results of the research: The paper contributes to the theoretical studies of ICT implementation in local governance. It also contributes to the discussion of the practical implementation of ICT in the allocation of limited resources at the local governance level.

Andrii Buriachenko, Tetiana Paientko
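
A toy sketch of the allocation idea is shown below: rank candidate services by utility per unit of cost and fund them greedily until the budget is exhausted. The services, utilities, costs and the greedy rule itself are illustrative assumptions, not the authors' algorithm.

```python
# Greedy budget allocation by utility per unit of cost (illustrative only).

def allocate(budget, services):
    """services: list of (name, cost, utility) tuples."""
    plan, remaining = [], budget
    for name, cost, utility in sorted(services, key=lambda s: s[2] / s[1], reverse=True):
        if cost <= remaining:
            plan.append(name)
            remaining -= cost
    return plan, remaining

services = [
    ("road repair", 400_000, 70),
    ("school equipment", 250_000, 60),
    ("park lighting", 120_000, 35),
    ("community events", 80_000, 15),
]
plan, left = allocate(600_000, services)
print(plan, "unspent:", left)
```
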
Development of Robo-Advisor System for Personalized Investment and Insurance Portfolio Generation

We researched how to use financial technology in the finance industry using the example of robo-advisors: we defined the basic functionality of a robo-advisor and reviewed robo-advisor implementations based on an analysis of the most popular financial services. We compared their functionality, composed a list of critical features and described the high-level architectural design of a general robo-advisor tool, the scope of application of robo-advisors, their key features, and a brief overview of existing solutions. Using the Markowitz model, we set up a concept of a robo-advisor application for investors who have different attitudes towards risk. Our goal is to cover the main features of a financial robo-advisor and to describe a high-level architecture for such applications. We have defined the main modules that make up the architecture of a typical robo-advisor and described different techniques that could be applied for building a personalized investment and insurance portfolio.

Serhii Savchenko, Vitaliy Kobets
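
Below is a sketch of the portfolio-construction core the abstract refers to: a Markowitz mean-variance optimization in which the risk-aversion coefficient encodes the investor profile (conservative versus aggressive). The expected returns and covariances are invented illustration values.

```python
# Mean-variance portfolio weights for different risk profiles (illustrative data).
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.06, 0.09, 0.12])          # hypothetical expected annual returns
cov = np.array([[0.020, 0.002, 0.004],
                [0.002, 0.050, 0.010],
                [0.004, 0.010, 0.090]])    # hypothetical covariance matrix

def optimal_weights(risk_aversion):
    """Maximize mu'w - (risk_aversion/2) * w'Cov w subject to w >= 0, sum(w) = 1."""
    objective = lambda w: -(mu @ w - 0.5 * risk_aversion * w @ cov @ w)
    constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1},)
    bounds = [(0, 1)] * len(mu)
    res = minimize(objective, x0=np.full(len(mu), 1 / len(mu)),
                   bounds=bounds, constraints=constraints)
    return res.x

print("conservative:", optimal_weights(10).round(2))   # high risk aversion
print("aggressive:  ", optimal_weights(1).round(2))    # low risk aversion
```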

5th International Workshop on Methods, Resources and Technologies for Open Learning and Research (MROL 2021)

Frontmatter
Using Virtual Reality Technologies for Teaching Computer Science at Secondary School

The article addresses the use of virtual reality technologies as an innovative technology for teaching computer science at secondary school. Several ways of using virtual reality technologies in computer science education were analyzed and researched. Accordingly, we have developed virtual reality software for teaching students spreadsheets, algorithms and programming. The effectiveness of using virtual reality tools for teaching computer science at secondary school was evaluated, and ways to improve the components of the scientific and methodological system of teaching computer science were identified, using the methods of mathematical statistics. The innovative nature of using virtual reality technologies is a methodological condition for increasing students’ interest in learning, intensifying the learning process, and revealing their cognitive, intellectual and personal potential, as well as for developing digital competencies, independence and more. The authors emphasize that the use of virtual reality technologies requires thorough development of the virtual learning environment, detailed selection of learning content and its adaptation to specific groups of students. Based on the analysis of the obtained results, possible ways of effectively and systematically implementing virtual reality technologies for teaching computer science at secondary school are identified, including the selection of methods and digital tools, the adaptation of theoretical material, and the development of individualized and creative tasks.

Oksana V. Klochko, Svitlana V. Tkachenko, Iryna M. Babiichuk, Vasyl M. Fedorets, Tetiana V. Galych
The Benefits of Using Immersive Technologies at General School

The article is devoted to the problem and benefits of using immersive technologies at general school. Immersive technologies are important tools for further improving the educational process, and it is important to study their impact on the mental and physical condition of young people, on which the effectiveness of using virtual reality in education at various levels will depend. The purpose of the article is to analyze the use of immersive technologies for school learning and to identify the basic benefits of using immersive technologies at general school. We made a comparative description of traditional learning and learning using immersive technologies. We highlight the main advantages of immersive technologies: the ability to change the relative size of the objects under study, which enables the visualization of objects of the micro and macro world; the creation of models of phenomena or processes that cannot be directly and clearly perceived by the human senses; the visualization of abstract models and the production of objects that have no form in the real world; and focusing students on the study of specific objects without distraction by external stimuli, which gives them the opportunity to concentrate fully on the material.

Nataliia V. Soroko, Svitlana H. Lytvynova
Use of VR Technologies in the System of Quality Assessment of Seafarer’s Professional Competence Formation

The article presents the specific features of using VR technology in the training of future marine specialists, using the example of forming the professional competence to ensure safe navigation through the use of information provided by navigation equipment and systems that support the decision-making process. The complexity of assessing constantly changing navigational circumstances and the complexity of the Electronic Chart Display and Information System (ECDIS) interface require the navigator to respond to these changes quickly. Thus, in the process of preparing future seafarers, it is necessary to create models of future professional actions that are as close to reality as possible. It has been proven that the use of VR technology increases the efficiency of learning educational material for ship control dynamic systems. The requirements for the software and hardware of the proposed model for the formation of the navigator’s professional competencies are formulated. At the beginning of the research, two subgroups of applicants with the same level of training were created. The experimental subgroup trained evasive maneuvers using VR technology and, as a result, most of its applicants learned to perform this task safely and quickly under the given weather conditions and channel geometry, compared to the classical subgroup. The article also provides algorithms for the software implementation of preparing and monitoring the proposed VR-system exercises as a component of the model for the formation of the navigator’s professional competencies.

Serhii Voloshynov, Andrii Petrovskyi, Halyna Popova, Vasyl Cherniavskyi, Tamara Pindosova
The Experience of Using Cloud Labs in Teaching Linux Operating System

The paper is devoted to the problem of studying the Linux operating system (OS) by future computer science teachers. The authors have developed the cloud laboratory CL-OS. Based on an analysis of related research, they conclude that a cloud lab is a system that uses cloud computing technology to provide the desired user interface for accessing computing resources. The authors have systematized some experience of using massive open online courses (MOOCs) and remote access laboratories and made assumptions about the possibility and necessity of creating cloud laboratories in universities and colleges. The article describes the experience of deploying the authors’ cloud laboratory, which is based on integrating the NDG Linux Essentials course from the Cisco Networking Academy and the Apache CloudStack cloud platform. The authors have analyzed the specifics of training under the conditions of using their cloud laboratory. In particular, some limitations and problems of students’ work under the conditions of the real educational process (both face-to-face and online) were identified. A pedagogical experiment was conducted to confirm the effectiveness of the authors’ approaches, and its results were verified using statistical methods.

Vasyl Oleksiuk, Oleg Spirin
Predicting Students’ Academic Performance Based on the Cluster Analysis Method

A mathematical model for assessing student performance based on cluster analysis has been developed. The assessment consists of two stages: an assessment of applicants to identify patterns in the results of external independent testing, and an assessment based on the results of current assessments and exams of students. The developed model gives an idea of how well a student will perform in disciplines that are related to the external independent testing, and also of how well the student will perform based on the results of intermediate certifications and examination sessions. This, in turn, makes it possible to draw attention to specific students who may have problems in a number of disciplines and topics and, conversely, to identify students who have a clear inclination towards a discipline and for whom it may be worth deepening this knowledge, thus allowing the introduction of an individual training vector. Experimental studies have been carried out to confirm the effectiveness of the developed model for the first and second stages of assessment. The developed model based on cluster analysis makes it possible to analyze and predict the progress of students of any higher educational institution in Ukraine.

Olha Pronina, Olena Piatykop
Telegram Messenger for Supporting Educational Process Under the Conditions of Quarantine Restrictions

The article is devoted to the problem of using the Telegram messenger in the educational process to support higher education under the conditions of forced quarantine restrictions due to the Covid-19 pandemic. Modern messengers (Telegram, Viber, Facebook Messenger, WhatsApp) are analyzed via pre-defined criteria (commerciality, functionality, architecture, security) and a range of indicators, and the benefits of using the Telegram messenger to support the educational process are outlined. The essence, benefits, and possibilities of using a Telegram chatbot are characterized, the main steps for creating a Telegram chatbot are described, and recommendations are provided to improve the pedagogical effect of using Telegram chatbots. The analysis of the results of educational interaction via the Telegram messenger is presented (a survey of 112 students and an evaluation of learning outcomes in the experimental (112 students) and control (110 students) groups). The empirical research has shown that the Telegram messenger made it possible to support the educational process under unexpected and forced quarantine restrictions, with no loss of qualitative indicators, while achieving the pedagogical goals.

Oleksandr Nosenko, Yuliia Nosenko, Roman Shevchuk
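
A minimal sketch of the messenger side of such a setup follows: sending a course announcement to a group chat through the Telegram Bot API over plain HTTPS. The token and chat id are placeholders; a real chatbot would also poll getUpdates or register a webhook to react to student messages.

```python
# Sending an announcement through the Telegram Bot API (placeholder credentials).
import requests

BOT_TOKEN = "123456:ABC-EXAMPLE"        # placeholder, obtained from @BotFather
CHAT_ID = "-1001234567890"              # placeholder group chat id

def send_message(text):
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    response = requests.post(url, data={"chat_id": CHAT_ID, "text": text}, timeout=10)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    send_message("Lecture 5 materials and the quiz link have been published.")
```
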
Designing a Tool for Economics Students Digital Competence Measurement

The article is devoted to the essence of, approaches to, and tools for the development and measurement of economics students’ digital competence. Apps of general and professional purpose are summarized according to their functionality and relevance for Ukrainian enterprises. The importance of the ability to select apps for the rational solution of professional tasks and of the development of specialists’ digital competence is emphasized. The results of a survey of students of the Poltava State Agrarian University (Ukraine) are presented to determine the main problems, advantages, and needs in the development of their digital competence. An analysis of foreign approaches (DigComp, DigCompEdu, DigCompOrg, OpenEdu, DigCompConsumers, EntreComp, etc.) contributed to the creation of a framework for measuring economics students’ digital competence. The framework covers the abilities (working with data, communication, content development, safety, problem-solving), indicators, and levels of digital competence development. The expert evaluation revealed that most experts (93.4%) evaluate the proposed project positively, and it can serve as a basis for developing narrower frameworks for specific economic or related specialties.

Liudmyla Dorohan-Pisarenko, Oleksandr Bezkrovnyi, Tetiana Pryidak, Olha Leha, Liudmyla Yaloveha, Olena Krasota
About Electronic Textbook “Mathematical Tasks Programming. First Steps”

The goal of improving the quality of secondary school students’ knowledge in the exact sciences determines the relevance of creating interactive electronic educational resources for solving tasks on the programming of mathematical models. The paper presents a description of the electronic textbook, its content and structure, its model, and its design and software development technology. The textbook is intended to qualitatively improve the preparation of high school students and junior university students in mathematics and programming. In particular, it can be used as a textbook in elective classes and young programmers’ clubs, as well as in individual work to prepare schoolchildren and students for programming Olympiads. The content of the electronic textbook is based on a set of mathematical tasks. Each task contains a statement of the task, instructions for its solution, and an algorithm for solving it, implemented as program code in Pascal, C/C++ and Python. The model of a learning system using the electronic textbook is described. The e-tutorial software is a web application built on a client-server architecture. The basis of the web application is the “Textbook” software module, which contains a set of mathematical tasks with the authors’ algorithms for solving them. The web application also contains a software module for expanding the electronic textbook content in the form of a built-in editor for mathematical tasks: a qualified user of the system can develop new tasks within the existing textbook format with a description of the algorithm for their solution, and an automatic system for checking the proposed solution algorithm provides the basis for an expert opinion on placing the new task in the library for further use. The approbation of the web application in the educational process was carried out in secondary educational institutions of Kherson, and it received a positive assessment from students and teachers.

Michael Lvov, Hennadiy Kravtsov, Ludmila Shishko, Olha Hniedkova, Irina Chernenko, Evgen Kozlovsky

International Workshop RMSEBT 2021: Rigorous Methods in Software Engineering and Blockchain Technologies

Frontmatter
Algebraic Virtual Machine Project

This paper presents a program system called the algebraic virtual machine (AVM), which handles industrial hardware specifications, programs in different languages, and models in an algebraic language. It uses formal algebraic methods that were developed in the scope of behavior algebra and that help to resolve problems of verification, analysis, testing, and cybersecurity. It makes it possible to create custom methods and theories and to try them on industrial examples with minimal effort. A machine learning technique is used to determine the efficiency of the formal methods, and the classification model is trained during algebraic processing. The formalization and checking of resistance to a blockchain attack are considered.

Oleksandr Letychevskyi, Volodymyr Peschanenko, Vladislav Volkov
Quantization in Neural Networks

Like the human brain, an artificial neural network is a complex nonlinear parallel processor; it is often called a neurocomputer. Accordingly, mathematical models of a neural network are usually continuous and stochastic and are naturally associated with fuzzy logic. Classical systems of artificial intelligence, by contrast, are naturally associated with classical logic and discrete mathematics. Thus, the representations and models of knowledge, undeniable at least since Aristotle, do not correspond to the cognitive models obtained from studying the human brain. In the view of Niels Bohr, quantization is the phenomenon of a discrete, sequential process that is inherent in continuous and stochastic systems. However, the traditional mathematical model of quantum mechanics did not allow generalization to dissipative systems. The corresponding generalization, called the Dynamic Quantum Model (DQM), was proposed by the author. It is defined for any dynamic system given by an ordinary differential equation or by a diffeomorphism, or for dynamic systems that use logical operations. A neural network is exactly a DQM in the space of input signals. In this paper, the DQM is defined and constructed universally both for Hamiltonian systems and for systems with a fuzzy logic truth function on phase space. The goal of the paper is to demonstrate quantization on the DQM, i.e. effectively on neural networks, and to extend the classical Bohr-Sommerfeld condition to the general case, in particular to systems with a fuzzy truth function.

Alexander Weissblut
Algebraic Modeling of Molecular Interactions

The article considers an approach to the algebraic modeling of molecular interactions in some environment in order to determine the triggering of the studied properties. The main idea of this study is to represent the actions of elementary particles in different molecular structures, in particular the motion of electrons in orbitals, as algebraic equations for further processing. Behavior algebra specifications are used as the modeling language. The article also describes the formalization of examples of atom interaction (the creation of chemical bonds), using the ionic bond as an example. The formalization and property analysis are carried out using the insertion modeling platform. The study is at an early stage of development, and the approach is demonstrated by several examples.

Oleksandr Letychevskyi, Yuliia Tarasich, Volodymyr Peschanenko, Vladislav Volkov, Hanna Sokolova

7th International Workshop on Theory of Reliability and Markov Modeling for Information Technologies (TheRMIT 2021)

Frontmatter
Collaborative-Factor Models of Decision-Making by Operators of the Air Navigation System in Conflict or Emergency Situations

The authors present a new approach to conflict management that ensures proper collaboration between different aviation personnel using decision-making methods under uncertainty. To improve the results of collective decisions, a dual risk assessment of decision-making in an emergency is used. Initially, the operators’ decisions are influenced by the factors of the occurrence and development of an emergency. Next, a collective matrix is formed from the individual rational decisions of the operators. The reliability and optimality of the resulting solutions are ensured by the individual and collaborative solutions of the operators. Optimal solutions for the emergency “failure of one engine on a twin-engine aircraft”, obtained using collaborative-factor decision-making models for the pilot, flight dispatcher, and air traffic controller, are presented.

Tetiana Shmelova, Maxim Yatsko, Yuliya Sikirda
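
Decision models of this kind typically start from a payoff matrix evaluated under classical uncertainty criteria; the toy sketch below applies Wald's maximin and Laplace's equal-likelihood rules to an invented matrix. The actions, states and payoffs are illustrative assumptions, not the paper's collaborative-factor models, which further combine such individual decisions into a collective matrix.

```python
# Choosing an action from a payoff matrix under uncertainty (illustrative values).
import numpy as np

actions = ["return to departure aerodrome", "continue to destination", "divert to alternate"]
# Rows: actions; columns: uncertain factors (e.g. weather worsening, traffic density).
payoff = np.array([
    [0.70, 0.60, 0.80],
    [0.40, 0.90, 0.30],
    [0.65, 0.70, 0.66],
])

wald_choice = actions[int(np.argmax(payoff.min(axis=1)))]       # best worst case
laplace_choice = actions[int(np.argmax(payoff.mean(axis=1)))]   # best average case
print("Wald:", wald_choice)        # the two criteria may recommend different actions
print("Laplace:", laplace_choice)
```
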
DNS-Based Fast-Flux Botnet Detection Approach

Today the problem of botnet detection is highly relevant, as botnets are widespread and are used to perform different types of cyberattacks and to threaten network services and users’ assets. One of the means botnets use to connect with their command-and-control (C&C) infrastructure is the domain name system (DNS). On the other hand, the fast-flux technique enables botnets to avoid detection. The paper presents a new botnet detection technique, which takes into account DNS feature analysis and botnet architecture aspects, as well as botnet behaviour in the network and on hosts. The proposed approach allows detecting the bots of botnets with centralized, decentralized and hybrid architectures with high efficiency.

Sergii Lysenko, Kira Bobrovnikova, Piotr Gaj, Oleg Savenko
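
One simple DNS feature such detectors commonly rely on is illustrated below: a fast-flux domain rotates through many A records, so repeated lookups accumulate an unusually large set of distinct IPs. Real systems, including the approach in the paper, combine many more features; the lookup count, pause and threshold here are arbitrary illustration values.

```python
# Counting distinct IPs returned for a domain over repeated lookups (illustrative only).
import socket
import time

def distinct_ips(domain, lookups=5, pause=2.0):
    seen = set()
    for _ in range(lookups):
        try:
            _, _, addresses = socket.gethostbyname_ex(domain)
            seen.update(addresses)
        except socket.gaierror:
            pass  # resolution failure; skip this round
        time.sleep(pause)
    return seen

if __name__ == "__main__":
    ips = distinct_ips("example.com")
    print(f"{len(ips)} distinct IPs observed")
    if len(ips) > 10:   # arbitrary threshold for the sketch
        print("suspiciously flux-like behaviour")
```
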
Checkable FPGA-Based Components of Safety-Related Systems

The paper is devoted to the problem of the checkability of circuits in FPGA components of safety-related systems, which are designed to operate in two modes, normal and emergency, in order to provide their own functional safety and the safety of controlled facilities, to prevent accidents and to reduce losses in case of their occurrence. Functional safety is ensured through the use of fault-tolerant solutions, which are, however, sensitive to sources of multiple failures, including hidden faults. Such faults can accumulate during a prolonged normal mode with limited checkability of the circuits and manifest themselves simultaneously at the beginning of the emergency mode. A fault-tolerant structure becomes fail-safe only if it is checkable. The problem of hidden faults manifests itself in the memory of the LUT units of FPGA components with a LUT-oriented architecture. The program code written in the memory of the LUT units is checked with a checksum, but it can be corrupted when its bits are read at the outputs of the LUT units. Bits observed only in emergency mode reduce the checkability of FPGA components and are potentially hazardous. Checkability can be increased by operating circuits on successively replaced versions of the program code that can be obtained for the same hardware implementation: the versions move potentially hazardous bits to checkable positions observed in normal mode. However, the set of such versions is significantly limited by the connection of the inputs of the LUT units to the inputs of the FPGA component. The proposed method overcomes this limitation by introducing an additional scheme. Experimental studies of library FPGA designs show a low level of checkability of these designs and the efficiency of the proposed method, which provides totally checkable circuits.

Oleksandr Drozd, Kostiantyn Zashcholkin, Anatoliy Sachenko, Oleksandr Martynyuk, Olena Ivanova, Julia Drozd
UAV Fleet Routing with Battery Recharging for Nuclear Power Plant Monitoring Considering UAV Failures

Reliability-based mission planning models for an unmanned aerial vehicle (UAV) fleet monitoring a nuclear power plant (NPP), with battery recharging, are developed. Battery recharging is carried out either at the depot or by using autonomous battery maintenance stations (ABMSs) deployed at certain points. A classification and a set of models corresponding to the ways in which UAVs follow their routes and recharge their batteries are suggested, and examples of applying the proposed models are given. The probability of successful fulfillment of the plan for the UAV fleet to perform the monitoring mission for the NPP and other critical infrastructure is used as an indicator when applying the proposed models.

Ihor Kliushnikov, Vyacheslav Kharchenko, Herman Fesenko
Structural and Analytical Models for Early APT-Attacks Detection in Critical Infrastructure

Modern information and communication technologies (ICT) are vulnerable to APT attacks (advanced persistent threats) and other relevant threats. An APT attack is mounted by a stealthy threat actor, typically a nation-state or state-sponsored group, which gains unauthorized access to ICT and remains undetected for an extended period. Early detection of APT attacks is very important for the ICT of critical infrastructure, but existing approaches do not allow effective detection in a fuzzy environment (networks/cyberspace). In this paper, a method of forming linguistic terms from statistical data was used to build structural and analytical models of host and network parameters, and an intruder model based on the defined host and network parameters was developed. On this basis, logical rules can be developed to support the functioning of an IDS based on honeypot technology for APT attack detection and intruder type identification in ICT.

Zhadyra Avkurova, Sergiy Gnatyuk, Bayan Abduraimova
Mobile Application for Healthy Maternal Behavior to Reduce Fetal Mortality

Timely health care delivery is often based on solutions that provide informational assistance and support. The scope of mobile applications in medicine is quite wide and relevant, which makes it possible to comply with the stated principles and provisions in the field of health care for the preservation of health. Various factors and IT solutions were identified in order to provide different methods of informational support to a woman during pregnancy. One of the motivations for the study is the early prediction of risks related to attention deficit and the probabilistic manifestation of an event in which a positive state worsens. The results of the analysis of modern mobile applications and systems made it possible to determine the functionality priorities for the majority of users. In accordance with the requirements and rules defined for the medical organizations of the Republic of Kazakhstan, a mobile application was developed in order to reduce risk during pregnancy and to maintain a positive emotional state. The use of methods of closed analysis, risk assessment methods, methods for determining the emotional state, and methods for determining relationships when monitoring the pregnancy process opens up wide opportunities for developing a model and structure of an application with improved functionality and characteristics.

Olimzhon Baimuratov, Sergiy Gnatyuk, Tolganay Salykbayeva
An Overlay Network Based on Cellular Technologies for the Secure Control of Intelligent Mobile Objects

The paper describes the current state of the problem of the secure control of smart mobile objects using existing telecommunication infrastructures. The scientific and technical task of developing principles for using cellular communication systems to organize secure remote control of intelligent mobile objects has been solved. The solution lies in the use of overlay computer networks based on VPN tunneling. Proposals have been developed for constructing an overlay computer network based on VPN, taking into account traffic aggregation as well as the division of network elements according to their purposes. The related problems of connection dependability at the access, distribution and core levels are considered. The possibility of using nested VPN tunneling in high-speed cellular communication systems to improve the security of transmitted data is analyzed. A number of model and experimental studies prove the efficiency of the proposed principles. Recommendations have been developed for using the proposed principles of constructing an overlay network based on cellular communication systems for the secure control of intelligent mobile objects in solutions related to implementing the concept of dependability and resilience of computer networks.

Vitalii Tkachov, Andriy Kovalenko, Vyacheslav Kharchenko, Kateryna Hvozdetska, Mykhailo Hunko

1st Ukrainian Natural Language Processing Workshop (UNLP 2021)

Frontmatter
Estimation of the Local and Global Coherence of Ukrainian Texts Using Transformer-Based, LSTM, and Graph Neural Networks

In this paper, different models for the estimation of both the local and the global coherence of Ukrainian-language texts are considered. To evaluate the local coherence of a document, Transformer-based and LSTM neural networks are proposed and trained on a Ukrainian-language news corpus. It is shown that the LSTM-based approach outperforms the corresponding network based on the Transformer architecture according to accuracy metrics when solving typical tasks on both test datasets. To investigate the connections between sentences revealed by the neural network, the Uniform Manifold Approximation and Projection (UMAP) dimension reduction technique is utilized to project sentence embeddings into 2D space. The clusters obtained may indicate that the designed model takes into account both the structure of a sentence and the different types of connections between sentences. To estimate the global coherence of a document, a model based on a graph convolutional neural network is suggested, and the appropriateness of taking into account the connections between all sentences regardless of their positions is shown. The results obtained for the designed and trained global coherence estimation model indicate that the models analyze different aspects of a text, which suggests using both the local and the global coherence estimation models according to the assigned task.

Artem Kramov, Sergiy Pogorilyy
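
A sketch of the dimensionality-reduction step mentioned in the abstract follows: projecting sentence embeddings to 2D with UMAP (the umap-learn package) so the groupings can be inspected. The embeddings below are random stand-ins for real model outputs, and the parameters are defaults rather than the paper's settings.

```python
# Projecting sentence embeddings into 2D with UMAP (synthetic embeddings).
import numpy as np
import umap  # pip install umap-learn

rng = np.random.default_rng(7)
# Pretend we have 300 sentence embeddings of dimension 768 from the coherence model.
embeddings = rng.normal(size=(300, 768))

projection = umap.UMAP(n_components=2, n_neighbors=15, random_state=7).fit_transform(embeddings)
print(projection.shape)  # (300, 2): ready for scatter-plotting and cluster inspection
```
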
Nonparametric Methods of Authorship Attribution in Ukrainian Literature

The paper presents the results of a comparison of two nonparametric methods for authorship identification of Ukrainian literary texts. The paper describes the implementation of the corresponding methods based on the Klyushin–Petunin tests and their simplified version; the method of n-gram selection is applied. For testing, a collection of texts of up to 200,000 characters from 10 authors was used. As a result of the tests, it was found that the simplified test is more sensitive and specific, and that monograms and bigrams, as opposed to trigrams, provide clear detection of authorship.

Dmitriy Klyushin, Yulia Nykyporets
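
A simplified sketch of the general n-gram idea is shown below: build character n-gram frequency profiles per author and attribute an unknown text to the closest profile. This is not the Klyushin–Petunin statistical test used in the paper, only an illustration of n-gram selection, and the texts are tiny placeholders.

```python
# Character n-gram frequency profiles for naive authorship attribution (illustrative).
from collections import Counter

def ngram_profile(text, n=2):
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values())
    return {g: c / total for g, c in grams.items()}

def distance(p, q):
    """L1 distance between two frequency profiles."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)

authors = {
    "author_A": "мова тексту першого автора мова стиль",
    "author_B": "зовсім інший стиль письма другого автора",
}
unknown = "стиль письма цього автора інший"

profiles = {name: ngram_profile(text) for name, text in authors.items()}
target = ngram_profile(unknown)
best = min(profiles, key=lambda name: distance(profiles[name], target))
print("attributed to:", best)
```
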
A Computational Lexicon of Ukrainian Discourse Connectives

We introduce a new lexicon of discourse connectives for the Ukrainian language. Discourse connectives such as ‘because’ and ‘therefore’ are grammatical elements that link clauses and sentences semantically and play a crucial role in discourse structure. They have been shown to be useful for many tasks in natural language processing, from argumentation mining to authorship analysis. We introduce a semi-automatic method for inventorying discourse connectives in under-resourced languages by leveraging existing lexicons from other languages. As a result, we provide the first computer-readable lexicon of 129 Ukrainian discourse connectives, with syntactic as well as semantic information for these items. Finally, we carry out a small pilot study using the lexicon for discourse-level corpus annotation and report on the distribution of connectives in Ukrainian in two different types of media.

Tatjana Scheffler, Veronika Solopova, Olha Zolotarenko, Mariia Razno
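
A tiny sketch of how such a lexicon can be applied is given below: scan a text for known connectives and report them with a semantic label. The three entries and their labels are placeholders for illustration; the actual lexicon contains 129 items with richer syntactic and semantic annotation.

```python
# Matching discourse connectives from a small (placeholder) lexicon in a sentence.
connectives = {
    "тому що": "reason",
    "отже": "result",
    "проте": "contrast",
}

def find_connectives(text):
    lowered = text.lower()
    return [(c, label) for c, label in connectives.items() if c in lowered]

sentence = "Захід перенесли, тому що погода зіпсувалася, проте квитки дійсні."
print(find_connectives(sentence))
# -> [('тому що', 'reason'), ('проте', 'contrast')]
```
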
Targeted Sentiment Analysis for Ukrainian and Russian News Articles

One of the most challenging problems in sentiment analysis is the analysis of multiple targets in the same text without clear boundaries between target contexts. This problem is even harder for Ukrainian and Russian because of the lack of datasets and established approaches. Responding to the business needs of our company, we created a bilingual dataset manually annotated for targeted sentiment according to strict guidelines. This dataset allowed us to fine-tune a pre-trained multilingual BERT model and improve key metrics (macro F1, F1 for the negative and positive classes) over the baseline models. As a by-product, we trained new NER models for both languages. The NER and targeted sentiment models were successfully introduced into the production environment at Semantrum.

Iuliia Makogon, Igor Samokhin
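
A schematic sketch of one common targeted-sentiment setup follows: marking the target span with special tokens and classifying the sentence with a multilingual BERT checkpoint. The checkpoint name, marker tokens and label order are illustrative assumptions, not the authors' production model, which was fine-tuned on their annotated dataset.

```python
# Target-marked sentence classification with a multilingual BERT (untuned checkpoint).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"   # stand-in for the fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

text = "Компанія [TGT] Semantrum [/TGT] показала хороші результати цього кварталу."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
labels = ["negative", "neutral", "positive"]
print(labels[int(logits.argmax(dim=-1))])   # meaningful only after fine-tuning
```
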
Ukrainian News Corpus as Text Classification Benchmark

One of the crucial problems of natural language processing for languages such as Ukrainian is the lack of datasets, both unlabeled (for pretraining word embeddings or large deep learning models) and labeled (for benchmarking existing approaches). In this paper we describe a framework for simple classification dataset creation with minimal labeling effort. We create a dataset for Ukrainian news classification and compare several pretrained models for the Ukrainian language in different training settings. We show that ukr-RoBERTa, ukr-ELECTRA and XLM-R tend to show the highest performance, although XLM-R tends to perform better on longer texts, while ukr-RoBERTa performs substantially better on shorter sequences. We publish this dataset on Kaggle ( https://www.kaggle.com/c/ukrainian-news-classification/ ) and suggest using it for further comparison of approaches to Ukrainian text classification.

Dmytro Panchenko, Daniil Maksymenko, Olena Turuta, Mykyta Luzan, Stepan Tytarenko, Oleksii Turuta
Backmatter
Metadata
Title
ICTERI 2021 Workshops
Edited by
Oleksii Ignatenko
Vyacheslav Kharchenko
Vitaliy Kobets
Hennadiy Kravtsov
Yulia Tarasich
Vadim Ermolayev
David Esteban
Vitaliy Yakovyna
Aleksander Spivakovsky
Copyright year
2022
Electronic ISBN
978-3-031-14841-5
Print ISBN
978-3-031-14840-8
DOI
https://doi.org/10.1007/978-3-031-14841-5
