
2021 | Book

Intelligent Methods in Computing, Communications and Control

Proceedings of the 8th International Conference on Computers Communications and Control (ICCCC) 2020

Edited by: Prof. Dr. Ioan Dzitac, Simona Dzitac, Prof. Dr. Florin Gheorghe Filip, Prof. Janusz Kacprzyk, Prof. Dr. Misu-Jan Manolescu, Dr. Horea Oros

Publisher: Springer International Publishing

Book series: Advances in Intelligent Systems and Computing


About this book

This book presents the proceedings of the International Conference on Computers Communications and Control 2020 (ICCCC2020), covering topics such as theory for computing and communications, integrated solutions in computer-based control, computational intelligence and soft computing, and decision-making and support systems.

The ICCCC was founded in Romania in 2006, and its eight editions have featured respected keynote speakers and leading computer scientists from around the globe.

Table of Contents

Frontmatter

Instead of Preface

Frontmatter
Redesign of a Conference from In-Person to Online. Case Study: ICCCC
Abstract
The volume “Intelligent Methods for Computing, Communications and Control”, published by Springer in the “Advances in Intelligent Systems and Computing” series, is in fact the Proceedings of the 8th International Conference on Computers Communications and Control (ICCCC) 2020. The ICCCC was founded in 2006 by Ioan Dzitac, Florin Gheorghe Filip and Misu-Jan Manolescu, and has been organized every even year by Agora University of Oradea, under the aegis of the Information Science and Technology Section of the Romanian Academy. The first seven editions were organized face to face (in-person, traditional). Due to the COVID-19 pandemic, the 8th edition, ICCCC2020, which was initially designed to be in-person, had to be redesigned as an online (remote) event. In this article we present our study and conclusions regarding a parallel between the two types of conferences, traditional vs. online, each with its advantages and disadvantages.
Ioan Dzitac, Simona Dzitac, Florin Gheorghe Filip, Misu-Jan Manolescu

Theory for Computing and Communications

Frontmatter
On Recursively Defined Combinatorial Classes and Labelled Trees
Abstract
We define and prove isomorphisms between three combinatorial classes involving labeled trees. We also give an alternative proof by means of generating functions.
Ali Chouria, Vlad-Florin Drăgoi, Jean-Gabriel Luque
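As a classical illustration of the generating-function approach to labelled trees (a standard textbook example, not a result taken from the paper above): the exponential generating function \(T(x)\) of rooted labelled trees satisfies a simple functional equation from which Lagrange inversion recovers Cayley's formula,

\[
T(x) = x\,e^{T(x)}, \qquad n!\,[x^n]\,T(x) = n^{\,n-1},
\]

so the number of labelled (unrooted) trees on \(n\) vertices is \(n^{n-2}\).
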
Tight Bounds on the Coefficients of Consecutive k-out-of-n:F Systems
Abstract
In this paper we compute the coefficients of the reliability polynomial of a consecutive-k-out-of-n:F system, in the Bernstein basis, using the generalized Pascal coefficients. Based on well-known combinatorial properties of the generalized Pascal triangle, we determine simple closed formulae for the reliability polynomial of a consecutive system for particular ranges of k. Moreover, for the remaining ranges of k (where we were not able to determine simple closed formulae), we establish easy-to-calculate sharp bounds for the reliability polynomial of a consecutive system.
Vlad-Florin Drăgoi, Simon Cowell, Valeriu Beiu
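For readers who want to experiment with consecutive-k-out-of-n:F systems, the following minimal Python sketch computes the reliability polynomial for i.i.d. components via the classical recursion; it does not use the Bernstein basis or the generalized Pascal coefficients of the paper above, and the chosen k and n are arbitrary examples.

from sympy import symbols, expand

def consecutive_reliability(k, n):
    # Reliability polynomial (in the component reliability p) of a
    # consecutive-k-out-of-n:F system with i.i.d. components, using the
    # classical recursion R(n) = R(n-1) - p*q^k*R(n-k-1).
    p = symbols('p')
    q = 1 - p
    R = {i: 1 for i in range(k)}       # fewer than k components: the system cannot fail
    R[k] = 1 - q**k                    # with exactly k components, all of them must fail
    for i in range(k + 1, n + 1):
        R[i] = expand(R[i - 1] - p * q**k * R[i - k - 1])
    return expand(R[n])

print(consecutive_reliability(2, 4))   # e.g. k = 2, n = 4
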
Reliability of Two-Terminal Networks Equivalent to Small Optimal Sorting Nets
Abstract
Sorting networks are a special case of “oblivious” sorting algorithms that can be implemented directly in hardware. Their underlying non-plane connectivity graph representations can be mapped onto a certain class of minimal two-terminal networks, allowing us to associate a two-terminal reliability polynomial to any (optimal) sorting network connectivity graph. This class of networks is interesting in that it intersects the class of “matchstick minimal” two-terminal networks (which includes the planar Moore-Shannon hammocks), yet neither of these two classes contains the other. We compare the two-terminal reliability polynomials associated in this manner to small optimal sorting network connectivity graphs, with the reliability polynomials of Moore-Shannon hammock networks of equivalent dimensions.
Simon R. Cowell, Mariana Nagy, Valeriu Beiu
Investigating Hammock Networks on IBM Q
Abstract
IBM Q (https://www.ibm.com/quantum-computing/) represents a great opportunity offered by IBM to the quantum research community, allowing experiments to be run, through a web interface, on several of their quantum systems in the cloud. One of the great technical challenges to making viable quantum computers is qubit fidelity (quality/reliability), together with a plethora of error correction techniques—which, obviously, link to reliability theory. Hammock networks (a device-level alternative to gate-level reliability schemes) have shown outstanding reliability enhancements in the classical digital domain (e.g., about two orders of magnitude better than gate-level von Neumann multiplexing). In spite of such performances, device-level reliability schemes in general, and hammock networks in particular, have never been assessed for quantum computations. A likely explanation is that device-level reliability seems much more akin to topological quantum computing concepts. That is why we have decided to test if and how much hammock networks might help in the quantum realm. Instead of theoretical analyses we have decided to perform simulations on IBM Q (unfortunately still gate-level constrained), and we report our preliminary findings in this paper.
Sorin Hoară, Roxana-Mariana Beiu, Valeriu Beiu
Experimenting with Beta Distributions for Approximating Hammocks’ Reliability
Abstract
It is a well-known fact that, in general, the combinatorial problem of finding the reliability polynomial of a two-terminal network belongs to the class of \( \# P \)-complete problems. In particular, hammock (aka brick-wall) networks are particular two-terminal networks introduced by Moore and Shannon in 1956. Rather unexpectedly, hammock networks seem to be ubiquitous, spanning from biology (neural cytoskeleton) to quantum computing (layout of quantum gates). Because computing exactly the reliability of large hammock networks seems unlikely (even in the long term), the alternatives we are facing fall under approximation techniques using: (i) simpler ‘equivalent’ networks; (ii) lower and upper bounds; (iii) estimates of (some of) the coefficients; (iv) interpolation (e.g., Bézier, Hermite, Lagrange, splines, etc.); and (v) combinations of (some of) the approaches mentioned above. In this paper we shall advocate—for the first time ever—for an approximation based on an ‘equivalent’ statistical distribution. In particular, we shall argue that as counting (lattice paths) is at the heart of the problem of estimating reliability for such networks, the binomial distribution might be a (very) good starting point. As the number of alternatives (lattice paths) gets larger and larger, a continuous approximation like the normal distribution naturally comes to mind. Still, as the number of alternatives (lattice paths) becomes humongous very quickly, more accurate and flexible approximations might be needed. That is why we put forward the beta distribution (as it can match the binomial distribution), and we use it in conjunction with a few exact coefficients (which help fitting the tails) to approximate the reliability of hammock networks.
Simon R. Cowell, Sorin Hoară, Valeriu Beiu
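The following self-contained Python sketch illustrates the general idea of matching a Beta distribution to a reliability curve, using a toy series-parallel network rather than an actual hammock; the network, the parameter grid and the fitting procedure are illustrative assumptions, not the construction used in the paper above.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import beta

def R(p):
    # Toy reliability polynomial: two parallel paths of two devices each,
    # R(p) = 1 - (1 - p^2)^2 (an illustrative stand-in for a hammock network).
    return 1.0 - (1.0 - p**2)**2

def beta_cdf(p, a, b):
    return beta.cdf(p, a, b)

p = np.linspace(0.0, 1.0, 201)
(a_hat, b_hat), _ = curve_fit(beta_cdf, p, R(p), p0=[2.0, 2.0], bounds=(0.1, 20.0))
err = np.max(np.abs(beta_cdf(p, a_hat, b_hat) - R(p)))
print(f"fitted Beta({a_hat:.3f}, {b_hat:.3f}), max abs error = {err:.4f}")
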
General Frameworks for Designing Arithmetic Components for Residue Number Systems
Abstract
In many previous works, researchers have proposed residue-based arithmetic components for the two classical moduli sets \((2^{p}, 2^p-1, 2^p+1)\) and \((2^{p}, 2^p-1, 2^{p-1}-1)\), where p is a positive integer. These components included reverse converters, sign identifiers, and scalers. In this paper, we widen the umbrella of these two sets to \((2^{k}, 2^p-1, 2^p+1)\) and \((2^{k}, 2^p-1, 2^{p-1}-1)\), where k is a positive integer such that \(0<k\le 2p\). The classical moduli sets are special cases of these expanded sets when \(p=k\). This paper introduces multiplicative inverses for these expanded moduli sets. The introduced multiplicative inverses will ease the process of designing residue-based arithmetic components. This paper also proposes general frameworks for designing reverse converters, sign identifiers, comparators, and scalers. Additionally, this work expands the options available to a designer wishing to design an RNS processor based on these enhanced moduli sets.
Ahmad Hiasat
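As background for the frameworks discussed above, the Python sketch below performs forward and reverse conversion for the expanded moduli set \((2^{k}, 2^p-1, 2^p+1)\) using generic Chinese Remainder Theorem reconstruction; it relies on Python's built-in modular inverse rather than the closed-form multiplicative inverses derived in the paper, and the chosen k and p are arbitrary examples.

from math import prod

def to_rns(x, moduli):
    # Forward conversion: integer -> residue representation.
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    # Reverse conversion via the Chinese Remainder Theorem
    # (generic modular inverses; the paper derives closed-form ones).
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m) is the modular inverse of Mi mod m
    return x % M

p, k = 5, 6                            # illustrative choice satisfying 0 < k <= 2p
moduli = [2**k, 2**p - 1, 2**p + 1]    # pairwise coprime, dynamic range M = 65472
x = 12345
assert from_rns(to_rns(x, moduli), moduli) == x
print(to_rns(x, moduli), "->", from_rns(to_rns(x, moduli), moduli))
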
The Shape of the Reliability Polynomial of a Hammock Network
Abstract
Motivated by the study of hammock (aka brick-wall) networks, we introduce in this paper the notion of X-path, which generates all possible connections through the network. The new concept of X-path, together with the Jordan curve theorem for piecewise smooth curves, allows us to provide a direct proof of duality properties for hammock networks. Afterwards, we closely link the reliability polynomial of the hammock network of length l and width w to the reliability polynomial of the dual hammock network (of length w and width l). An important consequence is that the computations required for finding the reliability polynomials of all hammock networks are reduced by half.
Leonard Dăuş, Marilena Jianu
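For context, the classical Moore-Shannon duality for planar two-terminal networks, which underlies the hammock relation summarized above, can be stated as follows: if \(N^{*}\) is the planar dual of the two-terminal network \(N\) and every device is closed with probability \(p\), then

\[
R_{N^{*}}(p) = 1 - R_{N}(1-p),
\]

and since the dual of a hammock of width \(w\) and length \(l\) is a hammock of width \(l\) and length \(w\), computing one of the two reliability polynomials immediately yields the other.
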

Integrated Solutions in Computer-Based Control

Frontmatter
Queuing Theory Application on DTN Buffer Management
Abstract
This paper aims to draw a parallel between the component elements of a queuing system and the buffer management used by Delay Tolerant Network (DTN) nodes. Given that waiting in a queue is a widespread experience, many attempts have been made to optimize the time spent in such queues. The introduction of the paper contains a brief description of some elements of buffer management in DTN networks. The second section presents introductory information on queuing theory and several related works. The third section draws a parallel between buffer management in DTN and queuing systems by implementing a new drop policy as part of buffer management. The obtained results are illustrated in a practical network context, using the ONE simulator.
Corina-Ştefania Nănău
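To connect the queuing-theory elements mentioned above with a finite DTN buffer, the short Python sketch below evaluates the blocking (drop) probability of an M/M/1/K queue, the textbook model of a single server holding at most K messages; the arrival and service rates are illustrative numbers, and this is not the drop policy implemented in the paper's ONE simulations.

def mm1k_blocking_probability(lam, mu, K):
    # Probability that an arriving message finds the buffer full in an M/M/1/K
    # queue (Poisson arrivals at rate lam, exponential service at rate mu,
    # one server, at most K messages in the system).
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

# Illustrative numbers: 8 messages/s arriving, 10 messages/s served, K = 20.
print(f"P(drop) = {mm1k_blocking_probability(8.0, 10.0, 20):.4f}")
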
Proposal of a Multi-standard Model for Measuring Maturity Business Levels with Reference to Information Security Standards and Controls
Abstract
Continuous information security risks force organizations to constantly update their security protocols. This implies, among other aspects, basing their monitoring mainly on their own maturity status in the SGSI (Information Security Management System). When a Chief Information Security Officer elaborates a protection plan for IT assets, a wide and varied range of threats must be considered. These tasks are executed using conceptual models, which do not usually work in an integrated and systematic way. Thus, these models seek to increase maturity levels for protecting and safeguarding information security. Among the most common [1] we find COBIT 5, CSE-CMM and NIST-CRST, to which we add security standards such as OWASP, ISO 27000-1 and SANS. From here, it is possible to see the lack of a multi-standard model that systematically integrates the individual actions with the expected results.
The present project proposes an integrated model that links and blends, on the one hand, the security standards and, on the other hand, the measurements of the organization’s maturity levels. By doing this, it is possible to rely on a set of relevant actions, classified by evaluation categories, which provide conditions for cross-referencing regulations and standardized controls. This finally allows exploring how efficient the adopted measures are and, when needed, the corrections that should be introduced.
Cristian Barria, David Cordero, Lorena Galeazzi, Alejandra Acuña
IT Solutions for Big Data Processing and Analysis in the Finance and Banking Sectors
Abstract
This paper aims to give a general overview of the technologies used by two important trends in Business Intelligence today that continue to reshape the Data Architecture landscape worldwide. Bringing equally relevant value to businesses, Fast Data and Big Data complement each other in order to enable both quick/short-term as well as thorough/long-term commercial strategies of companies, regardless of the industry they are part of. The body and conclusion of this paper focus on the benefits of using the newest FinTech solutions for both aforementioned data processing models, while clearly stating the differences between the two. Both open-source and proprietary solutions are presented with the purpose of offering a thorough picture of what the best architectural landscape for Big Data analytics should look like.
Catalin Ceaparu
Automatons Immersed in Ocean Currents for Transformation of Biomass into Fuel
Abstract
When observing the Chilean sea from both biotic and mechanical perspectives, oceanographers note that the Humboldt Current carries abundant biomass and that the movement of the water itself has the capacity to do work. Taking advantage of these two qualities of the ocean current, this article presents the sketch of an automated device, the computational simulation carried out when it was conceived, and its mathematical model for making efficient the capture of the biomass that will be processed, stored and dispatched as biodiesel. The submerged automaton has a structural configuration outlined by cybernetic design, resulting in a body that carries out the transformation process by itself, starting on the side that faces the current and its content of biomass. This raw material is trapped thanks to an intelligent system that informs the reactor about the relative importance of the state variables that its body can control, stimulating the swimming organisms to move in the desired direction. The captured biomass then undergoes processing until it becomes biodiesel by virtue of the mechanical energy provided by the same flow of seawater that affects the reactor. The rear part of the reactor releases both incident water and by-products into the sea without harmful environmental consequences. Some users of this new type of device are armies in times of conflict and merchant marines during algae blooms.
Lucio Cañete Arratia, Felisa Córdova, Andrés Pérez de Arce
Challenges Porting Blockchain Library to OpenCL
Abstract
This article discusses the complexities of porting a performance-oriented blockchain library, encompassing core cryptographic operations, to the OpenCL framework. We present the solution we developed as a general guideline and highlight the limitations of the OpenCL framework. Given the potential use case of multiple platforms and devices, the effective portability of the library for end users is presented. Finally, a comparison with a CUDA variant of the library is discussed in terms of code complexity, runtime and performance.
Grigore Lupescu, Nicolae Tapus
Using Two-Level Context-Based Predictors for Assembly Assistance in Smart Factories
Abstract
The paper presents some preliminary results in engineering a context-aware assistive system for manual assembly tasks. It employs context-based predictors to suggest the next steps during the manufacturing process and is based on data collected from experiments with trainees assembling a tablet. We were interested in finding correlations between the characteristics of the workers and the way they prefer to assemble the tablet. Each predictor is then trained with correct assembly styles extracted from the collected data and assessed against the whole dataset. Thus, we found the predictor that best matches the assembly preferences.
Arpad Gellert, Constantin-Bala Zamfirescu
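To make the idea of a context-based predictor concrete, here is a minimal Python sketch of a two-level predictor in the spirit described above: the first level keeps the last two observed assembly steps as the context (a history register), and the second level maps each context to the most frequently observed next step. The toy training sequence and step names are invented for illustration and are not the dataset of the paper.

from collections import defaultdict, Counter

class TwoLevelContextPredictor:
    # Level 1: history register holding the last `order` steps (the context).
    # Level 2: per-context frequency table of the step that followed.
    def __init__(self, order=2):
        self.order = order
        self.table = defaultdict(Counter)

    def train(self, sequence):
        for i in range(self.order, len(sequence)):
            context = tuple(sequence[i - self.order:i])
            self.table[context][sequence[i]] += 1

    def predict(self, history):
        context = tuple(history[-self.order:])
        counts = self.table.get(context)
        return counts.most_common(1)[0][0] if counts else None

# Illustrative assembly sequences (step names are made up).
predictor = TwoLevelContextPredictor(order=2)
predictor.train(["case", "board", "screen", "battery", "cover",
                 "case", "board", "battery", "screen", "cover"])
print(predictor.predict(["case", "board"]))   # most frequent next step for this context
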

Computational Intelligence and Soft Computing

Frontmatter
Visual Analysis of Multidimensional Scaling Using GeoGebra
Abstract
The paper deals with multidimensional scaling (MDS), which belongs to the class of nonlinear projection methods for the visual representation of multidimensional data. The performance of a new MDS-type method for multidimensional data dimensionality reduction and visualization (Geometric MDS) has been investigated visually using GeoGebra. The dynamic geometry program GeoGebra is non-commercial, interactive software for the visual representation of algebra and geometry. We developed specific GeoGebra scripts for the visual representation of the convergence process of Geometric MDS. This allows us to visually analyze the optimization of the stress function, which describes the visualization quality, and to find the basins of attraction of the local minima. The results allow an easier comprehension of MDS stress optimization by anti-gradient search. Moreover, the results deepen the understanding of Geometric MDS in general.
Martynas Sabaliauskas, Gintautas Dzemyda
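For readers unfamiliar with MDS stress minimization, the numpy sketch below runs plain gradient descent on the raw stress function for a small random dataset; it is a generic illustration of anti-gradient search, not the Geometric MDS update rule studied in the paper and not a GeoGebra script, and all sizes and step lengths are arbitrary choices.

import numpy as np

def stress_and_grad(Y, D):
    # Raw stress: sum over pairs of (||y_i - y_j|| - D_ij)^2, with its gradient.
    diff = Y[:, None, :] - Y[None, :, :]          # pairwise differences
    d = np.linalg.norm(diff, axis=2)              # low-dimensional distances
    np.fill_diagonal(d, 1.0)                      # avoid division by zero
    r = d - D
    np.fill_diagonal(r, 0.0)
    stress = 0.5 * np.sum(r**2)                   # each pair counted once
    grad = 2.0 * np.sum((r / d)[:, :, None] * diff, axis=1)
    return stress, grad

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))                      # 20 points in 5-D
D = np.linalg.norm(X[:, None] - X[None, :], axis=2)   # target distances
Y = rng.normal(size=(20, 2))                      # initial 2-D embedding

for _ in range(500):                              # plain (anti-)gradient descent
    s, g = stress_and_grad(Y, D)
    Y -= 0.01 * g
print(f"final stress: {s:.3f}")
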
Edge Computing in Real-Time Electricity Consumption Optimization Algorithm for Smart Grids
Abstract
Nowadays, electricity consumption optimization represents a major improvement opportunity for electricity suppliers as well as for consumers. Both sides can benefit from the progress of sensors and ICT technologies if an automated process is put in place. Hence, in this paper we propose an algorithm that monitors electricity consumption and provides optimizations for each consumer, all in real time. For accurate monitoring outputs and better computation, the algorithm runs in a smart grid environment, where smart meters, actuators and appliances can be found and easily integrated. The proposed solution is deployed in an edge computing environment, an architectural decision that makes the final implementation more performant and less costly.
Răzvan Cristian Marales, Adela Bâra, Simona-Vasilica Oprea
The Study of Trajectories of the Development of State Capacity Using Ordinal-Invariant Pattern Clustering and Hierarchical Cluster Analysis
Abstract
This work is devoted to a methodology for identifying structurally close objects of the type “country_year” based on a system of indicators characterizing state capacity over 1996–2015. A comparison of clustering methods (including hierarchical clustering) with methods of pattern analysis based on a pairwise comparison of indicators, namely ordinal-fixed and ordinal-invariant pattern clustering, is proposed. The possibility of combining clustering and pattern-analysis methods to obtain results that are interpretable from the point of view of political science is demonstrated. Groups of countries with similar development paths across the reference years are identified on the basis of a dynamic analysis of patterns. The dynamic change in state capacity (from the point of view of the selected indicator system) of 166 countries of the world is determined.
Alexey Myachin, Andrei Akhremenko
A Micro Simulation Approach for a Sustainable Reduction Traffic Jam
Abstract
Public transport represents an important traffic flow in many countries, and urban traffic management has to face this situation. Chile is not an exception, particularly in the case of the most populated cities. To deal with this scenario, a simulation model has been built for a study case in a Chilean city to demonstrate how the problem can be approached. This case shows a serious conflict between public and private transport vehicles. These conflicts generate longer travel times between home, work or study, higher vehicle operating costs and environmental impacts. The objective of the study is to propose a simulation model to produce and evaluate action plans that reduce traffic jams and the generated conflicts. The VISSIM™ computational micro-simulation programme is used. The model simulates intersections where conflicts occur and applies urban traffic management for an efficient use of roads. This computational micro-simulator uses a vehicle tracking model and a lane change model, plus other models that have been incorporated into this work. For the collection of information, field data such as vehicular flows, traffic light programming, speeds and queue length measurements were obtained. Two models were simulated; the better of them manages to mitigate the congestion problem, suggesting a change of the traffic light programming from 120 s to 90 s and proposing the use of short bus-only tracks for public transport. These results positively influence speeds and queues, by 14% and 18% respectively. Even though the developed model solves a particular case, it can be tailored to other situations.
Alejandra Valencia, Cecilia Montt, Astrid M. Oddershede, Luis E. Quezada
Empirical Versus Analytical Solutions to Full Fuzzy Linear Programming
Abstract
We approach full fuzzy linear programming by grounding the definition of the optimal solution in the extension principle framework. Employing a Monte Carlo simulation, we compare an empirically derived solution to the solutions yielded by approaches proposed in the literature. We also propose a model able to numerically describe the membership function of the fuzzy set of feasible objective values. At the same time, the decreasing (increasing) side of this membership function represents the right (left) side of the membership function of the fuzzy set containing the maximal (minimal) objective values. Our aim is to provide decision-makers with relevant information on the extreme values that the objective function can reach under the given uncertain constraints.
Bogdana Stanojević, Milan Stanojević
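To illustrate the extension-principle and Monte Carlo idea described above, the Python sketch below treats the coefficients of a toy linear program as triangular fuzzy numbers, samples a random membership level and random realizations inside the corresponding alpha-cuts, solves each crisp problem with scipy, and collects (level, optimal value) pairs whose upper envelope empirically traces the membership function of the optimal objective values; the toy problem, the fuzzy numbers and the sampling scheme are assumptions for illustration only.

import numpy as np
from scipy.optimize import linprog

def tri_cut(a, m, b, alpha, rng):
    # Sample a value from the alpha-cut [a + alpha*(m - a), b - alpha*(b - m)]
    # of the triangular fuzzy number (a, m, b).
    return rng.uniform(a + alpha * (m - a), b - alpha * (b - m))

rng = np.random.default_rng(1)
samples = []
for _ in range(2000):
    alpha = rng.uniform()
    # Toy full fuzzy LP: maximize c1*x1 + c2*x2 subject to a11*x1 + a12*x2 <= b1, x >= 0.
    c = [tri_cut(2, 3, 4, alpha, rng), tri_cut(1, 2, 3, alpha, rng)]
    A = [[tri_cut(1, 2, 3, alpha, rng), tri_cut(2, 3, 4, alpha, rng)]]
    b = [tri_cut(10, 12, 14, alpha, rng)]
    res = linprog(c=[-c[0], -c[1]], A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
    if res.success:
        samples.append((alpha, -res.fun))   # (membership level, crisp optimal value)

values = np.array([v for _, v in samples])
print(f"empirical range of optimal objective values: [{values.min():.2f}, {values.max():.2f}]")
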
Critical Analysis of Faults in Operation of Energy Systems Using Fuzzy Logic
Abstract
Nowadays, establishing an efficient operating regime for hydraulic installations represents a complex problem requiring a large amount of calculation. This paper presents a multi-criteria method using fuzzy logic to improve the functioning of a pumping station by minimizing the electric energy consumption and maximizing its efficiency in critical regimes. The method supposes the minimization of the maximum pumped flow rate by establishing an efficient time interval between the starting and stopping of the installation, which has a significant effect on the energy efficiency during operation. The numerical model includes the flow rate consumption correlated with its optimum parameters, by planning proper operating times. A new non-deterministic model is introduced, associated with a fuzzy controller system improved with an inference algorithm. This model is used to reduce the consumed flow rate with the help of linear optimization and of the transition from a branched network, analyzed as a neural network, to an annular one. The neural network is structured on five input variables and nine hidden layers (seven as sub-input and two as output). The schematic structure of the implemented neural network is presented, together with its main objective and the improved functioning. The fuzzy numerical model is tested with the Matlab software, using a permanent function for the input and output variables. Some of the numerical results, conclusions, and references are finally mentioned.
Victorita Radulescu
Fuzzy-Logic Based Diagnosis for High Voltage Equipment Predictive Maintenance
Abstract
This paper presents a fuzzy-logic algorithm for predictive maintenance (Industry 4.0) and, at the same time, for future design improvement, applicable to high voltage equipment (switches, surge arresters, etc.). Starting from this algorithm, a software tool can be developed for high voltage equipment maintenance. It is an example of implementing advanced mathematical and software solutions for the maintenance of operational high voltage switching equipment (and other high voltage devices, too). For testing and validating the algorithm, experimental data (and also experimental setups) were obtained from the industrial environment (manufacturers and users of that equipment) and used to conceive a digital monitoring and diagnosis procedure, both for a more efficient design of that equipment and for efficiently assessing the technical state of its main HV contacts. The results obtained are encouraging and recommend the use of the algorithm on a larger scale.
Mihaela Frigura-Iliasa, Attila Simo, Simona Dzitac, Flaviu Mihai Frigura-Iliasa, Felicia Ioana Baloi
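As a generic illustration of the kind of fuzzy reasoning such a diagnosis algorithm builds on (the linguistic variable, thresholds, rules and scores below are invented for illustration and are not taken from the paper), the following Python sketch fuzzifies a hypothetical contact-resistance measurement with triangular membership functions and evaluates two Mamdani-style rules.

import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function with support [a, c] and peak at b.
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

r = 85.0   # hypothetical measured contact resistance (micro-ohms) of an HV switch
low, medium, high = trimf(r, 0, 40, 80), trimf(r, 40, 80, 120), trimf(r, 80, 120, 160)

# Hypothetical Mamdani-style rules (rule strength = membership of the antecedent):
#   IF resistance is high   THEN maintenance urgency is high
#   IF resistance is medium THEN maintenance urgency is moderate
urgency_high, urgency_moderate = high, medium

# Crude defuzzification: weighted average of representative urgency scores (0.9, 0.5).
score = (0.9 * urgency_high + 0.5 * urgency_moderate) / max(urgency_high + urgency_moderate, 1e-9)
print(f"memberships: low={low:.2f}, medium={medium:.2f}, high={high:.2f}; urgency={score:.2f}")
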

Decision Making and Support Systems

Frontmatter
Making a Multi-criteria Analysis Model for Choosing an ERP for SMEs in a KM World
Abstract
There has never been such a rich offer of ERP solutions for customers to choose from, because the problem of knowledge has never been emphasized as much as it is today. An ERP system in line with the company’s business is metamorphosed into a knowledge management tool that improves information transfer and generates “knowledge”. On the other hand, major decisions are rarely simple, and the best alternative can emerge only after careful deliberation. In the present research we endeavor to provide a model of multiple-criteria decision-making (MCDM) for the selection of ERP applications. We illustrate and apply an MCDM technique, namely the analytical hierarchy process (AHP), to assist SMEs in selecting the most appropriate ERP. We formulate an AHP decision model and apply it to a hypothetical case study to demonstrate the feasibility of selecting the most appropriate ERP software for a specific SME. We believe that our work could become a didactic source of inspiration for teaching decision-making techniques to young people and students. The application of the proposed model indicates that it can improve decision-making processes and shorten the time interval needed for ERP selection. Our model can be considered good practice and identified with the know-what and know-who components of a KM model.
Ramona Lacurezeanu, Vasile Paul Bresfelean
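For readers new to AHP, the Python sketch below derives criterion weights from a pairwise comparison matrix via the principal eigenvector and reports the consistency ratio; the three criteria (cost, functionality, vendor support) and the comparison values are hypothetical and are not taken from the case study in the paper above.

import numpy as np

# Hypothetical pairwise comparisons (Saaty's 1-9 scale) for three ERP-selection
# criteria: cost, functionality, vendor support.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # AHP priority vector (criterion weights)

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)        # consistency index
RI = 0.58                                   # Saaty's random index for n = 3
print(f"weights = {np.round(w, 3)}, consistency ratio = {CI / RI:.3f}")
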
A Multi-Objective Model for Devices Procurement with Application in Health Care
Abstract
Managers need to make informed choices about what to buy in order to meet priority needs and to avoid wasting limited resources. The procurement decision is a very difficult task since there exists a great variety of brands, vendors and equipment performances. In the present paper, we develop a decision process for equipment procurement in which the following are used in combination: the multi-criteria subjective weighting method SWARA (Step-wise Weight Assessment Ratio Analysis) for the equipment evaluation weights, an adaptation of SAW (Simple Additive Weighting) for equipment performance, and a new multi-objective optimization model for equipment procurement. The multi-objective model considers several types of equipment, their costs and their performances. The model aims to aid the decision process of equipment procurement.
Managers of health care systems need to find in their choices a compromise between the cost of procurement, brands’ reputation, vendors’ reputation and equipment performance. A numerical example for medical equipment procurement, based on sensors, is studied.
Constanţa Zoie Rădulescu, Marius Rădulescu, Lidia Băjenaru, Adriana Alexandru
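To make the SAW step mentioned above concrete, here is a minimal Python sketch that normalizes a small decision matrix and ranks alternatives by their weighted sum; the three hypothetical devices, criteria and weights are invented for illustration (in the paper the weights come from SWARA and the scores feed a multi-objective procurement model).

import numpy as np

# Hypothetical performance matrix: rows = 3 candidate devices,
# columns = (accuracy, battery life, cost); cost is a "smaller is better" criterion.
X = np.array([[0.92, 30.0, 1200.0],
              [0.88, 45.0,  900.0],
              [0.95, 25.0, 1500.0]])
weights = np.array([0.5, 0.3, 0.2])        # e.g. obtained from SWARA
benefit = np.array([True, True, False])    # benefit vs cost criteria

# SAW normalization: benefit criteria x/max, cost criteria min/x.
N = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = N @ weights
print("SAW scores:", np.round(scores, 3), "-> best device:", int(np.argmax(scores)))
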
Methodological Proposal to Define the Degree of Automation in the Sanitary Industry in Chile to Adapt to Climate Change
Abstract
This study proposes a methodology to support decision making aimed at improving the efficiency of the technological standard of sanitary industry facilities, considering climate change effects. Nowadays, the population’s requirements in terms of environment, quality and continuity of service are constantly increasing. In this regard, the sanitary industry is adopting new technologies for its processes, with the purpose that they become a factor for service improvement. At present, the Chilean sanitary industry is concerned about the degree of automation and the infrastructure requirements, since these are the main critical factors for future investment planning. Therefore, it is necessary to determine the current level of the telecontrol system facilities and generate actions to make improvements in those processes that show a poor quality of service. The research methodology is based on a case study, integrating planning processes, data analysis and a scoring method interacting with a multi-criteria approach. This paper emphasizes the development of a decision model using the Analytical Hierarchy Process (AHP) to identify the priority facilities that should improve their technological standard. A case study incorporating climate change factors is pursued in a metropolitan sanitary company in Chile, determining the automation degree of a telecontrol system for the real case. These results make it possible to elaborate an investment plan that can be converted into action plans for a sanitary company.
Claudio J. Macuada, Astrid M. Oddershede, Luis E. Quezada, Pedro I. Palominos
Mapping the Intellectual Structure of the International Journal of Computers Communications and Control: A Content Analysis from 2015 to 2019
Abstract
The International Journal of Computers Communications & Control (IJCCC) is an open access peer-reviewed journal publishing original research papers, and it is considered by professionals, academics and researchers as one of the main sources of knowledge in the fields of integrated solutions in computer-based control and communications, computational intelligence methods and soft computing, and advanced decision support systems. With this in mind, this research conducts a bibliometric performance and intellectual structure analysis of the IJCCC from 2015 to 2019. It provides a framework to support computer, communication and control researchers and professionals in the development and direction of future research, identifying core, transversal, emerging and declining themes. For this purpose, the IJCCC’s intellectual structure and thematic networks are analyzed according to data retrieved from the Web of Science Core Collection, putting the focus on the main research themes and their performance. Finally, this analysis has been developed using SciMAT, an open-source (GPLv3) bibliometric software tool designed to perform science mapping analysis under a longitudinal framework.
José Ricardo López-Robles, Manuel J. Cobo, Nadia Karina Gamboa-Rosales, Enrique Herrera-Viedma
A Model for Solving Optimal Location of Hubs: A Case Study for Recovery of Tailings Dams
Abstract
In this paper, a method for the optimal location of multiple hubs in a complex network with a large number of nodes is presented. The method is applied to the design of a logistics network composed of many tailings dams and mineral processing plants, and combines two data mining techniques, k-Medoids and k-Means, with the multi-criteria decision making model PROMETHEE for the prioritization of nodes to be included in the clusters, based on certain technical and economic decision variables (such as the content of recoverable metals and the costs of transportation). The proposed method contributes to solving a large-scale mathematical problem that is difficult to handle due to the number of variables and criteria. A case study for the recovery of abandoned deposits of mining waste is presented. The case study demonstrates the feasibility and usefulness of the proposed solution and lays the groundwork for further research and other applications of machine learning techniques for big data in support of sustainable production and a circular economy.
Rodrigo Barraza, Juan Miguel Sepúlveda, Juan Venegas, Vinka Monardes, Ivan Derpich
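As a minimal illustration of the clustering half of the method described above, the numpy sketch below runs plain k-means on synthetic 2-D node coordinates and, in each cluster, takes the node closest to the centroid as a candidate hub (a crude stand-in for k-Medoids); the data are random and the PROMETHEE prioritization step is not shown.

import numpy as np

def kmeans(points, k, iters=50, seed=0):
    # Plain k-means: assign nodes to the nearest center, then move each center
    # to the mean of its cluster (k-Medoids would instead pick an actual node).
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([points[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

rng = np.random.default_rng(42)
nodes = rng.uniform(0, 100, size=(200, 2))   # synthetic node coordinates (e.g. km)
labels, centers = kmeans(nodes, k=4)

# Candidate hub of each cluster: the actual node nearest to the cluster centroid.
hubs = [int(np.argmin(np.linalg.norm(nodes - c, axis=1))) for c in centers]
print("candidate hub node indices:", hubs)
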
Backmatter
Metadata
Title
Intelligent Methods in Computing, Communications and Control
Edited by
Prof. Dr. Ioan Dzitac
Simona Dzitac
Prof. Dr. Florin Gheorghe Filip
Prof. Janusz Kacprzyk
Prof. Dr. Misu-Jan Manolescu
Dr. Horea Oros
Copyright year
2021
Electronic ISBN
978-3-030-53651-0
Print ISBN
978-3-030-53650-3
DOI
https://doi.org/10.1007/978-3-030-53651-0
