
2020 | Book

Information and Communication Technologies in Education, Research, and Industrial Applications

15th International Conference, ICTERI 2019, Kherson, Ukraine, June 12–15, 2019, Revised Selected Papers

Editors: Prof. Vadim Ermolayev, Prof. Frédéric Mallet, Vitaliy Yakovyna, Prof. Dr. Heinrich C. Mayr, Aleksander Spivakovsky

Publisher: Springer International Publishing

Book Series: Communications in Computer and Information Science


About this book

This book contains extended versions of the best papers presented at the 15th International Conference on Information and Communication Technologies in Education, Research, and Industrial Applications, ICTERI 2019, held in Kherson, Ukraine, in June 2019.
The 19 revised full papers included in this volume were carefully reviewed and selected from 416 initial submissions. The papers are organized in the following topical sections: advances in ICT and IS research; ICT in teaching, learning, and education management; applications of ICT in industrial and public practice.

Table of Contents

Correction to: Information and Communication Technologies in Education, Research, and Industrial Applications
Vadim Ermolayev, Frédéric Mallet, Vitaliy Yakovyna, Heinrich C. Mayr, Aleksander Spivakovsky

Advances in ICT and IS Research

Automated Design of Parallel Programs for Heterogeneous Platforms Using Algebra-Algorithmic Tools
The further development of software tools based on the algebra-algorithmic approach and the term rewriting technique, in the direction of automated design of programs for heterogeneous platforms using OpenCL, is proposed. A method for semi-automatic parallelization of loop operators is developed. A particular feature of the approach is the use of high-level algebra-algorithmic program specifications. The tools automate the design of parallel programs, starting with the construction of an algorithm scheme as a superposition of operations of Glushkov's system of algorithmic algebra, and then synthesize the corresponding source code in a target programming language on the basis of the scheme. The parallelization technique is based on loop tiling and data serialization and uses rewriting rules to transform programs. The application of the approach is illustrated by the example of developing an OpenCL interpolation program used in numerical weather forecasting. The results of an experiment executing the generated OpenCL program on a graphics processing unit are given.
Anatoliy Doroshenko, Oleksii Beketov, Mykola Bondarenko, Olena Yatsenko
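The loop tiling the abstract relies on can be sketched in a few lines of plain Python (an illustrative stand-in for the authors' algebra-algorithmic toolchain; `tiled_sum` and the tile size are hypothetical names):

```python
def tiled_sum(data, tile):
    """Sum `data` by processing fixed-size tiles, mirroring how loop
    tiling splits one loop into independent chunks of work that a
    heterogeneous backend (e.g. OpenCL work-groups) could run in parallel."""
    total = 0
    for start in range(0, len(data), tile):
        # each tile is an independent unit of work
        total += sum(data[start:start + tile])
    return total

print(tiled_sum(list(range(10)), 4))  # 45, same as sum(range(10))
```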
Optimized Term Extraction Method Based on Computing Merged Partial C-Values
Assessing the completeness of a document collection, regarding terminological coverage of a domain of interest, is a complicated task that requires substantial computational resources and human effort. Automated term extraction (ATE) is an important step within this task in our OntoElect approach. It outputs the bags of terms extracted from incrementally enlarged partial document collections for measuring terminological saturation. Saturation is measured iteratively, using our thd measure of terminological distance between two bags of terms. The bags of retained significant terms T_i and T_{i+1}, extracted at the i-th and (i+1)-st iterations, are compared (thd(T_i, T_{i+1})) until it is detected that thd has gone below the individual term significance threshold. The flaw of our conventional approach is that the sequence of input datasets is built by adding an increment of several documents to the previous dataset. Hence, the major part of the documents undergoes term extraction repeatedly, which is counter-productive. In this paper, we propose and prove the validity of an optimized pipeline based on the modified C-value method. It processes the disjoint partitions of a collection instead of the incrementally enlarged datasets. It computes partial C-values and then merges these into the resulting bags of terms. We prove that the results of extraction are statistically the same for the conventional and optimized pipelines. We support this formal result by evaluation experiments proving document collection and domain independence. By comparing the run times, we prove the efficiency of the optimized pipeline. We also prove experimentally that the optimized pipeline effectively scales up to process document collections of industrial size.
Victoria Kosa, David Chaves-Fraga, Hennadii Dobrovolskyi, Vadim Ermolayev
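The saturation check can be sketched as follows, assuming (as one plausible reading, not the authors' exact definition) that thd sums the absolute score differences of the earlier bag's terms, normalized by that bag's total score:

```python
def thd(bag_prev, bag_next):
    """Terminological difference between two bags of terms, sketched as the
    sum over terms of the earlier bag of the absolute differences of their
    significance scores, normalized by the total score of the earlier bag.
    (An illustrative reading of the thd measure, not the authors' code.)"""
    norm = sum(bag_prev.values())
    diff = sum(abs(score - bag_next.get(term, 0.0))
               for term, score in bag_prev.items())
    return diff / norm if norm else 0.0

t1 = {"ontology": 5.0, "term": 3.0}
t2 = {"ontology": 5.0, "term": 3.0, "saturation": 1.0}
print(thd(t1, t2))  # 0.0: the bags agree on all terms of t1
```

When successive thd values fall below the significance threshold, the collection is considered terminologically saturated.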
Expressibility in the Kleene Algebra of Partial Predicates with the Complement Composition
In this paper we investigate the expressibility of partial predicates in the Kleene algebra extended with the composition of predicate complement and give a necessary and sufficient condition for this expressibility in terms of the existence of an optimal solution of an optimization problem. We also investigate expressibility in the first-order Kleene algebra with predicate complement. The obtained results may be useful for software verification using an extension of the Floyd-Hoare logic to partial pre- and postconditions.
Ievgen Ivanov, Mykola Nikitchenko
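Partial predicates with Kleene's strong connectives can be modeled with `None` standing for "undefined"; the sketch below is illustrative, and the reading of the complement composition (true exactly where the predicate is undefined) is an assumption, not the paper's definition:

```python
# Kleene's strong three-valued connectives over partial predicate values,
# with None standing for "undefined".
def k_and(p, q):
    if p is False or q is False:
        return False          # a false conjunct decides the result
    if p is None or q is None:
        return None           # otherwise undefinedness propagates
    return True

def k_or(p, q):
    if p is True or q is True:
        return True
    if p is None or q is None:
        return None
    return False

def k_not(p):
    return None if p is None else not p

# One plausible reading of the predicate-complement composition (an
# assumption for illustration): the complement holds exactly where the
# original predicate is undefined.
def complement(p):
    return p is None

print(k_and(True, None))   # None
print(k_and(False, None))  # False
```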
Program-Oriented Logics of Renominative Level with Extended Renomination and Equality
The formalism of program logics is a main instrument for software verification. Such logics are based on formal program models and reflect the main program properties. Among various program logics, Floyd-Hoare logic and its variants take a special place because of their naturalness and simplicity. However, such logics are oriented toward total pre- and postconditions, and in the case of partial conditions they become unsound. Different methods to overcome this problem were proposed in our previous works. One of the methods involves extending program algebras with the composition of predicate complement. This permits modifying the rules of the logic to make them sound. Such modification requires the introduction of undefinedness conditions into the logic rules. To work with such conditions, the underlying predicate logic should become more expressive. In this paper we continue our research on such logics. We investigate a special program-oriented predicate logic called the logic of renominative (quantifier-free) level with the composition of predicate complement, extended renomination, and the equality predicate. This logic is a constituent part of the program logic. We introduce a special consequence relation for this logic, construct a sequent calculus, and prove its soundness and completeness.
Mykola Nikitchenko, Oksana Shkilniak, Stepan Shkilniak
SMT-LIB Theory of Nominative Data
In this article, we describe the theory of nominative data, formulate the basic principles of the composition-nominative approach, define the class of nominative data and functions, and describe a calculus for the theory of nominative data. By using nominative data, we can increase the adequacy of the representation of data structures, functions, and compositions that are used in programming languages. Thus, in terms of the composition-nominative approach, we can build a program verification system on a unified conceptual basis. Computer-aided verification of computer programs often uses SMT (satisfiability modulo theories) solvers. A common technique is to translate preconditions, postconditions, and assertions into SMT formulas in order to determine whether required properties hold. The SMT-LIB Standard was created to form a common standard and library for solving SMT problems. It is now one of the most widely used libraries for SMT systems, and formulas in SMT-LIB format are accepted by the great majority of current SMT solvers. The theory of nominative data is of interest for software modelling and verification but currently lacks support in the SMT-LIB format. In the article, we propose a declaration of the theory of nominative data for the SMT-LIB Standard 2.6.
Liudmyla Omelchuk, Olena Shyshatska
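Nominative data are hierarchical name-value structures; a minimal Python rendering of the naming and denaming operations (illustrative only, since the paper works algebraically and in SMT-LIB) might look like:

```python
# Nominative data sketched as nested name-value maps.
def naming(name, value):
    """Naming operation: wrap a value under a name."""
    return {name: value}

def denaming(name, data):
    """Denaming operation: extract the value bound to `name`.
    Partial: returns None (undefined) when the name is absent
    or the datum is not a nominative (map-like) value."""
    return data.get(name) if isinstance(data, dict) else None

state = {"x": 1, "inner": {"y": 2}}
print(denaming("y", denaming("inner", state)))  # 2
print(denaming("z", state))                     # None (undefined)
```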
Intelligent Support of the Business Process Model Analysis and Improvement Method
Since business process modeling is considered the foundation of Business Process Management, it is necessary to design understandable and modifiable process models that can be used to analyze and improve the depicted business processes. Therefore, this article proposes a method for business process model analysis and improvement. The lifecycle of Business Process Management, from business process modeling to applying Business Intelligence and process mining techniques, is considered, and existing approaches to business process model analysis are reviewed. The proposed method is based on best practices in business process modeling, process model metrics, and corresponding thresholds. The usage of business process model metrics and thresholds to formalize process modeling guidelines is outlined, and the procedure of business process model analysis and improvement is shown. The application of Business Intelligence techniques to support the proposed method is demonstrated, and the obtained results are shown and discussed.
Andrii Kopp, Dmytro Orlovskyi
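The metric-and-threshold idea can be sketched as a one-rule checker (the element-count threshold of 31 is a hypothetical value chosen for illustration, not taken from the paper):

```python
def check_model_size(model_elements, threshold=31):
    """Toy modeling-guideline check: flag a business process model whose
    element count exceeds a size threshold -- the kind of metric-and-
    threshold rule the method formalizes. The threshold is hypothetical."""
    n = len(model_elements)
    return {"elements": n, "within_threshold": n <= threshold}

model = ["start", "task A", "gateway", "task B", "end"]
print(check_model_size(model))  # {'elements': 5, 'within_threshold': True}
```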
The Use of Analogy to Simplify the Mathematical Description of the Didactical Process
The article presents a new method of modelling the didactical process using a developed educational network and a microsystems simulator. The didactical process can be represented in the intuitive form of a network of connected elements, in a way similar to electrical circuits. The network represents the differential equations describing a dynamic system which models the information flows as well as the learning and forgetting phenomena. The solutions of the equations are more adequate than the direct formulas used in modelling, i.e., the learning and forgetting curves known from the literature. The network variables and their meaning are related to generalized variables defined in the generalized environment. This enables using any of the microsystems simulators and gives access to many advanced simulation and optimization algorithms. The use of the microsystems simulator enables simulation of the didactical process over time and long-term prediction of its effects, including after its completion. Based on the simulation, one can design a teaching process, as well as draw conclusions about the process itself and the composition of groups. Selected examples with a brief description are included. The issues discussed in the work may be of interest to those involved in the analysis and mathematical description of the didactic process, and to those who deal with modelling of systems which incorporate learning and forgetting processes, in particular in production processes or learning platforms.
Paweł Plaskura
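A generic learning-with-forgetting dynamic of the kind such networks encode can be sketched as a single differential equation integrated numerically (an illustrative model, not the paper's network equations):

```python
def knowledge_trajectory(a, f, k0=0.0, dt=0.1, steps=100):
    """Euler integration of a generic learning-with-forgetting model
    dK/dt = a*(1 - K) - f*K, where K is the knowledge level, a the
    learning rate and f the forgetting rate. (Illustrative dynamic
    system, not the paper's microsystem network equations.)"""
    k = k0
    traj = [k]
    for _ in range(steps):
        k += dt * (a * (1.0 - k) - f * k)
        traj.append(k)
    return traj

traj = knowledge_trajectory(a=0.5, f=0.1)
print(round(traj[-1], 3))  # approaches the equilibrium a / (a + f) ≈ 0.833
```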

ICT in Teaching, Learning, and Education Management

Developing a Mobile Augmented Reality Application for Enhancing Early Literacy Skills
Mobile applications allow connecting the world around us with digital information. Augmented Reality (AR) applications are commonly used in museums, in education, and for research, and can attract users with contextual and locally specific information. We present the concept, design, and development of the AR application “Tilsimli arifler”, in which we create a literate environment that motivates children to practice reading and writing. To enhance children's reading ability, we propose a learning mobile application that uses modern mobile technologies, in particular augmented reality, to study the alphabet and learn how to form words. A prerequisite here is the use of special (drawn) cards and the camera of a mobile device: when a card is scanned with the camera of the user's smartphone, a three-dimensional image of the object depicted on the card appears and can be rotated in different directions (“spun”) in real time.
Marlen Ablyaev, Afife Abliakimova, Zarema Seidametova
System for Testing Physics Knowledge
The paper describes a model of a physics knowledge testing system and presents an approach to building a system for testing procedural physics knowledge, i.e., knowledge of basic physics laws and the ability to use them.
This approach consists of constructing mathematical models for each academic module in a physics course. The main constructive objects are test templates: mathematical models of test tasks based on physical models of systems, processes, and phenomena. A template for a class of physical tests, for testing knowledge of the laws of physics and the ability to transform a physical system, is represented by a set of elements of various formats: geometric drawings, diagrams, graphs of functional dependencies, a system of formulas for converting physical quantities, patterns of scenarios for changing the states of a physical system, and a response pattern.
Each such template can be used both in algorithms generating multiple similar specific tests and in algorithms for automatically checking the correctness of answers. The proposed method allows describing a relatively simple class of specific tests. An important characteristic of the system is the ability to automatically check both the final answer and the parameters of the intermediate states of the physical system.
The implementation of a procedural physics knowledge testing system can be performed by creating interactive multimedia software objects using the methods of computer mathematics and algebraic programming technology. The model of the physics knowledge testing system was implemented in a software module for a distance learning system.
Michail Lvov, Sergey Kuzmenkov, Hennadiy Kravtsov
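A test template of this kind can be sketched as a generator that draws parameters and returns both the task text and an automatic checker (a hypothetical free-fall example, not one of the paper's templates):

```python
import random

def projectile_template(seed=None):
    """Toy test template (hypothetical example): generates a free-fall
    task with a randomized fall time and returns the task text together
    with an automatic checker for the final answer."""
    rng = random.Random(seed)
    g = 9.8
    t = rng.randint(1, 5)            # fall time, seconds
    answer = g * t * t / 2.0         # distance fallen, metres

    def check(student_answer, tol=1e-2):
        # automatic correctness check against the generated parameters
        return abs(student_answer - answer) <= tol

    task = f"A body falls freely for {t} s. Find the distance (g = 9.8 m/s^2)."
    return task, check

task, check = projectile_template(seed=1)
print(task)
```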
Digital Learning Environment of Ukrainian Universities: The Main Components to Influence the Competence of Students and Teachers
Digital transformation requires specific flexibility from modern universities so that society's demands can be met through innovative teaching and ICT. Modern universities create a digital learning environment to support study activities. This research presents an expert estimate of the current condition and perspectives of universities' digital learning environments in Ukraine. We verified the structure of the theoretical model of the university digital learning environment by means of factor analysis of empirical data. We studied the components of the existing learning environment and the enabling environment and compared them to the results of our previous research. We estimated the survey reliability and analyzed the influence of digital learning environment formation on the respondents' digital competence. We showed that the theoretical model of the digital learning environment is correct and that the visions of students and teachers correspond to the key trends accelerating higher education technology adoption. We assume that digital learning environment development helps overcome significant challenges impeding higher education technology adoption and increases the level of digital competence, and that these tendencies have an impact on the growing competitiveness of students and teachers.
Olena Kuzminska, Mariia Mazorchuk, Nataliia Morze, Oleg Kobylin
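Survey reliability of the kind estimated here is commonly measured with Cronbach's alpha, which can be computed as follows (a standard formula shown for illustration; the paper does not state which reliability measure it uses):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for survey reliability. `items` is a list of
    columns, one list of respondent scores per survey item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var / var(totals))

print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0 for perfectly consistent items
```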
Complexity Theory and Dynamic Characteristics of Cognitive Processes
The features of modeling the cognitive component of social and humanitarian systems are considered. An example using multiscale entropy, multifractal, recurrence, and network complexity measures shows that these and other synergetic models and methods allow us to correctly describe the quantitative differences between cognitive systems. We propose to regard the cognitive process as a single realization of an individual cognitive trajectory, which can be represented as a time series, and to investigate its static and dynamic features by the methods of complexity theory. The prognostic capabilities of complex systems theory allow the corresponding pedagogical technologies to be corrected. We also propose to track and quantitatively describe the cognitive trajectory using specially transformed computer games, which can be used to test the processual characteristics of thinking.
Vladimir Soloviev, Natalia Moiseienko, Olena Tarasova
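One of the simplest measures in this family, the recurrence rate of a time series, can be sketched as follows (scalar case, main diagonal included, purely for illustration):

```python
def recurrence_rate(series, eps):
    """Recurrence rate of a scalar time series: the fraction of point
    pairs (i, j) whose values lie within eps of each other -- one of the
    complexity measures of the kind applied to cognitive trajectories."""
    n = len(series)
    close = sum(1 for i in range(n) for j in range(n)
                if abs(series[i] - series[j]) <= eps)
    return close / (n * n)

print(recurrence_rate([1.0, 1.0, 5.0], eps=0.1))  # 5/9: five close pairs of nine
```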

Applications of ICT in Industrial and Public Practice

Short-Term Electricity Price Forecasting: Deep ANN vs GAM
Examining the spot price series of electricity over time, it is striking that the electricity price across the day follows a course determined by power consumption, with its day-and-night rhythm. The daily course changes in its height and temporal extent over both the course of the week and the course of the year. This study deals methodologically with this intra-day and seasonal behaviour. We compare the forecasting accuracy of deep Artificial Neural Networks (ANN) of different architectures and Generalized Additive Models (GAM), and apply these models to European data.
Jan-Hendrik Meier, Stephan Schneider, Chan Le, Iwana Schmidt
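The additive intra-day component a GAM would fit can be approximated, at its crudest, by hourly means (an illustrative stand-in for a smooth daily term, not the authors' model):

```python
def daily_profile(prices_by_hour):
    """Estimate the additive day/night component of electricity prices as
    the mean price per hour of day -- the simplest stand-in for the smooth
    daily term a GAM would fit. `prices_by_hour` is a list of
    (hour, price) observations."""
    sums, counts = [0.0] * 24, [0] * 24
    for hour, price in prices_by_hour:
        sums[hour] += price
        counts[hour] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

obs = [(8, 40.0), (8, 44.0), (3, 20.0)]
profile = daily_profile(obs)
print(profile[8], profile[3])  # 42.0 20.0
```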
Model of Functional Behavior of Healthcare Internet of Things Device Using Erlang Phase Method
The presented paper deals with an exponentially growing technology, the Internet of Things (IoT), in the field of healthcare and medicine. The goal of the paper is to develop and research a discrete-continuous stochastic model (DCSM) of the functional behavior of a networked healthcare device (in the considered case, an insulin pump) in the form of a structural automaton model (SAM) using the Erlang phase method. The healthcare IoT environment and the hazards and behavior of a networked insulin pump are briefly discussed, with a description of the functional procedures and of the indicators and parameters of functionality and safety. Much attention is given to the development of the DCSM using exponential and Erlang distribution laws, the description of the basic events and the structure of the state vector, and the development of the SAMs. Procedures for validating the developed models for the exponential and Erlang distribution laws are presented and include four research cases to check the relevance of the obtained results. The obtained results show the limit value of the probability of task non-performance.
Anastasiia Strielkina, Bohdan Volochiy, Vyacheslav Kharchenko, Serhiy Volochiy
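The core idea of the Erlang phase method, replacing a non-exponential delay by a chain of exponential phases so the model stays Markovian, can be sketched by sampling an Erlang-k delay as a sum of k exponential phases:

```python
import random

def erlang_sample(k, rate, rng):
    """Erlang phase method idea: an Erlang-k delay equals k consecutive
    exponential phases with the same rate, so non-exponential timing can
    be embedded in a Markov (structural automaton) model."""
    return sum(rng.expovariate(rate) for _ in range(k))

rng = random.Random(42)
samples = [erlang_sample(3, 1.5, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to the theoretical mean k / rate = 2.0
```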
Multi-fragmental Markov’s Models for Safety Assessment of NPP I&C System Considering Migration of Hidden Failures
The information and control systems of a Nuclear Power Plant and other safety-critical systems are considered as a set of three independent hardware channels including an online testing system. The design of Nuclear Power Plant information and control systems on programmable platforms is rigidly tied to the V-model of the life cycle. Safety and availability during the life cycle are assessed using Markov and multi-fragment models. The multi-fragment model MICS32 contains an absorbing state in the case of hidden faults and allows evaluating the risks of “hidden” unavailability. The MICS42 model simulates the “migration” of states with undetected failures into states with detected faults. These models describe the functioning of the system and the complete elimination of software faults. The results of multi-fragment modeling are compared to evaluate the proof test period, taking into account the requirements of the SIL3 level and the limiting values of hidden fault probabilities. The multi-fragment models are included in a method for assessing the implementation of safety requirements for ICS on programmable platforms. An information technology for decision support in assessing and managing the implementation of ICS safety requirements is also considered.
Vyacheslav Kharchenko, Yuriy Ponochovnyi, Artem Boyarchuk, Anton Andrashov, Ihor Rudenko
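For intuition, the simplest relative of these models is the two-state Markov availability model, whose steady-state availability is A = mu / (lambda + mu) (a textbook formula, far simpler than the MICS32/MICS42 models themselves):

```python
def steady_state_availability(failure_rate, repair_rate):
    """Steady-state availability of the classic two-state Markov model
    (up/down with exponential failure and repair): A = mu / (lambda + mu).
    A textbook illustration, not the paper's multi-fragment models."""
    return repair_rate / (failure_rate + repair_rate)

# e.g. one failure per 1000 h, repairs taking 10 h on average
print(steady_state_availability(1e-3, 1e-1))  # ≈ 0.990
```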
About One Approach to Modelling Dynamics of Network Community Opinion
The paper is devoted to the study of the dynamics of network community opinion. The motivation for this problem is the observed growth of manipulation of public opinion using modern Information and Communication Technology. To control this situation, we need precise knowledge about the factors influencing the evolution of public opinion in different kinds of communities. This knowledge cannot be obtained by observation alone. In the paper, therefore, the authors propose a framework for the mathematical and simulation modelling of the evolution of network community opinion. Based on this framework, several kinds of communities are studied.
Grygoriy Zholtkevych, Olena Muradyan, Kostiantyn Ohulchanskyi, Sofiia Shelest
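As a point of reference (the paper's own framework is more general and not specified in the abstract), the classic DeGroot model of network opinion dynamics updates each agent's opinion to a weighted average of its neighbours' opinions:

```python
def degroot_step(opinions, weights):
    """One DeGroot update: each agent's new opinion is the weighted
    average of its neighbours' opinions (a classic baseline model of
    network opinion dynamics, shown as an illustration)."""
    n = len(opinions)
    return [sum(weights[i][j] * opinions[j] for j in range(n))
            for i in range(n)]

# row-stochastic trust matrix for three agents
W = [[0.50, 0.50, 0.00],
     [0.25, 0.50, 0.25],
     [0.00, 0.50, 0.50]]
x = [1.0, 0.0, -1.0]
for _ in range(50):
    x = degroot_step(x, W)
print(max(x) - min(x))  # spread shrinks toward 0: the agents reach consensus
```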
Our Approach to Formal Verification of Token Economy Models
Tokenomic modeling is one of the most efficient approaches to understanding and predicting the behavior of its subject. There are many tools for simulation modeling of behavior in different domains, but the literature lacks examples of their usage for tokenomic modeling. The paper considers a formal methods approach to tokenomic modeling. It provides a brief description of the technology and of the methods and tools developed by the authors for token economy modeling and for the analysis and study of its properties. The article also describes the formalization of a tokenomics model on the example of the SKILLONOMY project and presents the concrete and symbolic SKILLONOMY models and their simulation results. The formalization and property analysis are carried out using an insertion modeling platform.
Oleksandr Letychevskyi, Volodymyr Peschanenko, Maksym Poltoratskyi, Yuliia Tarasich
Using Trading System Consolidated Models in Stock Exchange Price Forecasting
For successful trading on stock exchanges, it is important to use trading tools that ensure success in trading operations and provide competitive advantages. The purpose of the article is to develop an algorithm for the creation of a trading system and to select a research object whose shares may subsequently become the object of real trade. The basis of the developed trading system is a consolidated mathematical model built from several models (multipliers, a neural network, and discounted cash flows). Two forecasting models were created. The first consolidated model estimates share prices with a lower deviation from the actual prices than the prices predicted by other mathematical models. For the second consolidated model, TakeProfit was set at the forecasted level, which allows closing the position as soon as the target price is reached. The results of the work identify directions for improving trading algorithms through the use of elements of fundamental analysis, namely by forecasting the impact of macroeconomic factors and evaluating market indices.
Liubov Pankratova, Tetiana Paientko, Yaroslav Lysenko
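The consolidation and TakeProfit steps can be sketched as follows (hypothetical weights and prices; the paper combines multipliers, a neural network, and discounted cash flows in a way the abstract does not specify):

```python
def consolidated_forecast(forecasts, weights):
    """Consolidate several model forecasts into one price estimate as a
    weighted average -- an illustrative consolidation rule, not the
    paper's actual model."""
    return sum(f * w for f, w in zip(forecasts, weights)) / sum(weights)

def take_profit_hit(prices, target):
    """Return the index at which the traded price first reaches the
    TakeProfit target (the position would be closed there), else None."""
    for i, p in enumerate(prices):
        if p >= target:
            return i
    return None

target = consolidated_forecast([105.0, 102.0, 99.0], [0.5, 0.3, 0.2])
print(round(target, 2))                                 # 102.9
print(take_profit_hit([100.0, 101.5, 103.2, 104.0], target))  # 2
```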
Simulation as a Method for Asymptotic System Behavior Identification (e.g. Water Frog Hemiclonal Population Systems)
Studying any system requires the development of ways to describe the variety of its conditions. This development includes three steps. The first is to identify groups of similar systems (associative typology). The second is to identify groups of objects which are similar in the characteristics important for their description (analytic typology). The third is to arrange systems into groups based on their predicted common future (dynamic typology).
We propose a method to build such a dynamic typology for a system. The first step is to build a simulation model of the studied systems. The model must be non-deterministic and simulate stochastic processes: it generates a distribution of the studied systems' output parameters for the same initial parameters. We verify the correctness of the model by aligning the parameter sets generated by the model with the set of conditions of the original systems evaluated empirically. In the case of a close match between the two, we can presume that the model adequately describes the dynamics of the studied systems. At the next stage, we determine the probability distribution of the outcomes of the systems' transformation. Such outcomes should be defined based on simulating the transformation of the systems over a time sufficient to determine their fate. If the systems demonstrate asymptotic behavior, their phase space can be divided into pools corresponding to different predictions of their future state. A dynamic typology is determined by which of these pools each system falls into.
We implemented the pipeline described above to study water frog hemiclonal population systems. Water frogs (the Pelophylax esculentus complex) are an animal group displaying interspecific hybridization and non-Mendelian inheritance.
Dmytro Shabanov, Marina Vladymyrova, Anton Leonov, Olga Biriuk, Marina Kravchenko, Quentin Mair, Olena Meleshko, Julian Newman, Olena Usova, Grygoriy Zholtkevych
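The pool-classification step can be sketched with a toy stochastic model (illustrative only, not the actual population dynamics): each simulation run is assigned to the pool matching its asymptotic outcome:

```python
import random

def simulate_fate(p_survive, rng, max_steps=200):
    """Toy stochastic model (not the frogs' actual dynamics): a system
    survives each step with probability p_survive; a run ends in the
    'extinct' pool or, if it lasts max_steps, the 'persists' pool."""
    for _ in range(max_steps):
        if rng.random() > p_survive:
            return "extinct"
    return "persists"

def outcome_distribution(p_survive, runs=1000, seed=0):
    """Estimate the probability distribution over outcome pools."""
    rng = random.Random(seed)
    counts = {"extinct": 0, "persists": 0}
    for _ in range(runs):
        counts[simulate_fate(p_survive, rng)] += 1
    return counts

print(outcome_distribution(0.999))  # most runs fall into the 'persists' pool
print(outcome_distribution(0.9))    # most runs fall into the 'extinct' pool
```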
Cluster Analysis of Countries Inequality Due to IT Development Through Macros Application
Industry 4.0 is likely to be a way to reach both economic efficiency and social equity. Nevertheless, nobody can predict the social impact of Industry 4.0 on society, which is transforming into Society 4.0. The purpose of this paper is to carry out a cluster analysis of country inequality due to IT development using a software package and a macros application. We researched the impact of gross capital formation, research and development expenditure to create innovations, intellectual property, and high-technology exports on the inequality of countries, using principal component analysis based on open data for 2012–2015. Cluster analysis allows countries to be divided into homogeneous groups that describe the impact of IT development on these countries in the absence of training samples. An isotonic or an isomorphic algorithm is used for the cluster analysis; the PCA method serves as the isotonic algorithm, while the isomorphic algorithm groups countries which are close in structure. We present the advantages of the isomorphic algorithm over the isotonic one using a macros application, which gives us the necessary links between countries to develop a dendrite diagram for clusters of countries under the impact of IT factors. As a result, we obtained 7 clusters describing the diverse impact of IT factors on the non-uniform income distribution across 45 countries. The first group of clusters shows that IT factors motivate a decrease in economic inequality; the second group demonstrates that IT factors lead to inequality; the third group shows that other (non-IT) factors generate a decrease in inequality; and the fourth group shows that other (non-IT) factors generate an expansion of inequality.
Vitaliy Kobets, Valeria Yatsenko, Mykhaylo Voynarenko
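The dendrite-diagram step, linking each country to its structurally closest neighbour, can be sketched as a minimum spanning tree over indicator vectors (Prim's algorithm; the Euclidean distance and the data are illustrative, not the paper's):

```python
def dendrite_links(points):
    """Build a dendrite (minimum spanning tree via Prim's algorithm)
    linking structurally close points -- a sketch of the dendrite-diagram
    step, with Euclidean distance over indicator vectors."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    n = len(points)
    in_tree, links = {0}, []
    while len(in_tree) < n:
        i, j = min(((i, j) for i in in_tree for j in range(n)
                    if j not in in_tree),
                   key=lambda e: dist(points[e[0]], points[e[1]]))
        links.append((i, j))
        in_tree.add(j)
    return links

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
print(dendrite_links(pts))  # [(0, 1), (1, 2), (2, 3)]: nearest pairs linked
```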