ITNG 2021 18th International Conference on Information Technology-New Generations
- 2021
- Book
- Editor
- Dr. Shahram Latifi
- Book Series
- Advances in Intelligent Systems and Computing
- Publisher
- Springer International Publishing
About this book
This volume represents the 18th International Conference on Information Technology - New Generations (ITNG), 2021. ITNG is an annual event focusing on state-of-the-art technologies pertaining to digital information and communications. Applications of advanced information technology to domains such as astronomy, biology, education, geosciences, security, and health care are among the topics of relevance to ITNG. Visionary ideas, theoretical and experimental results, as well as prototypes, designs, and tools that help information flow readily to the user are of special interest. Machine Learning, Robotics, High Performance Computing, and Innovative Methods of Computing are examples of related topics. The conference features keynote speakers; best student, poster, and service awards; a technical open panel; and workshops/exhibits from industry, government, and academia. This publication is unique in that it captures modern trends in IT with a balance of theoretical and experimental work; most other works focus on either theory or experiment, but not both. Accordingly, we know of no directly comparable literature.
Table of Contents
-
Frontmatter
-
AI and Robotics
-
Frontmatter
-
Chapter 1. Conceptualisation of Breast Cancer Domain Using Ontology
Reshmy Krishnan, P. C. Sherimon, Menila James
This chapter delves into the conceptualization of the breast cancer domain using ontology, emphasizing the role of intelligent decision support systems in clinical settings. It introduces the use of ontologies to define semantic knowledge and their application in data searching strategies. The proposed system, the Breast Cancer Prediction System, leverages three ontologies—questionnaire ontology, clinical guidelines ontology, and symptom ontology—to collect patient data and predict breast cancer risk and stage. The system architecture is detailed, showcasing the use of Protégé for ontology creation and SPARQL for querying. The chapter highlights the validation of ontologies and the potential for extending standardized ontologies to improve accuracy in breast cancer prediction.
This summary of the content was generated with the help of AI.
Abstract
The conceptualization of the breast cancer domain using ontology is an emerging area of intelligent decision support systems. Even though it is not a replacement for clinicians, such an intelligent system can support them effectively during diagnosis. As the system requires data from clinicians and patients, unorganized data is gathered and processed. Because the input data are unstructured, it is hard to extract information and share knowledge from them. An adaptive questionnaire is used to gather data and optimize the result of the system. The paper discusses a prototype model that uses various ontologies as the knowledge base, a Java engine to provide information to the modeller, and a reasoner to make effective decisions. SPARQL is used to retrieve the required information according to the conditions. Protégé, which supports OWL representation, provides a platform to build concepts and relationships. The paper shows how the ontology represents the details of the breast cancer guidelines and how instances of a class are identified using queries.
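As an illustrative aside (not taken from the chapter), the kind of class-instance retrieval the abstract describes can be sketched with a toy triple store; the patient and class names below are invented, and the real system queries an OWL ontology in Protégé with SPARQL rather than this naive pattern match:

```python
# Hypothetical sketch: ontology facts as subject-predicate-object triples,
# with instance retrieval analogous to a SPARQL query such as
#   SELECT ?s WHERE { ?s rdf:type :HighRiskPatient }
TRIPLES = {
    ("PatientA", "hasSymptom", "BreastLump"),
    ("PatientA", "rdf:type", "HighRiskPatient"),
    ("PatientB", "hasSymptom", "SkinDimpling"),
    ("PatientB", "rdf:type", "LowRiskPatient"),
}

def instances_of(cls):
    """Return every subject typed as `cls`, sorted for determinism."""
    return sorted(s for s, p, o in TRIPLES if p == "rdf:type" and o == cls)

print(instances_of("HighRiskPatient"))  # ['PatientA']
```

A production system would hold thousands of such triples and let the reasoner infer additional `rdf:type` facts before the query runs.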
Chapter 2. Traffic Light Control and Machine Learning: A Systematic Mapping Review
Dimitrius F. Borges, Edmilson M. Moreira, Adler D. de Souza, João Paulo R. R. Leite
The chapter delves into the growing issue of traffic congestion due to increased vehicle fleets and the limitations of fixed-time traffic lights. It focuses on the application of Machine Learning, particularly Reinforcement Learning, to adapt traffic light control systems to real-time conditions. The study conducts a systematic mapping review of 132 articles, identifying the most prominent ML models and techniques used in this domain. Key findings include the prevalence of Q-learning and the need for more real-world testing of these models. The chapter also discusses the trends and gaps in the field, suggesting a future shift towards more complex and specialized RL models.
Abstract
The global vehicle fleet has grown rapidly over the past decade, impacting the way traffic must be managed. Vehicle traffic management and control through technology is a well-known and widely studied problem that continues to present challenges and opportunities for action, mainly due to growing demand, the mentioned increase in the vehicle fleet, and the inefficiency of current systems, generally based on fixed-time traffic lights. Solutions have been presented for this scenario, and among them, Artificial Intelligence (AI) and Machine Learning (ML) techniques have stood out. The AI/ML field, however, is vast and varied. This article proposes a survey of the most used AI/ML techniques in the management of vehicular traffic lights, and it does so through a Systematic Mapping Review (SMR), pointing out the models that receive the greatest focus, research trends, and gaps.
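Since the review identifies Q-learning as the most prevalent technique, a minimal sketch may help readers unfamiliar with it. The toy two-phase intersection below, its states, rewards, and hyperparameters, are all invented for illustration and are not the paper's model:

```python
import random

# Tabular Q-learning on a toy intersection (illustrative assumptions only).
# State: which approach has the longer queue (0 = NS, 1 = EW).
# Action: which phase gets the green (0 = NS, 1 = EW).
# Reward: +1 for serving the longer queue, -1 otherwise.
random.seed(0)
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.2
Q = [[0.0, 0.0], [0.0, 0.0]]  # Q[state][action]

for _ in range(2000):
    state = random.randint(0, 1)
    if random.random() < EPSILON:                      # explore
        action = random.randint(0, 1)
    else:                                              # exploit
        action = max((0, 1), key=lambda a: Q[state][a])
    reward = 1.0 if action == state else -1.0
    next_state = random.randint(0, 1)                  # traffic arrivals are random here
    # Standard Q-learning update rule.
    Q[state][action] += ALPHA * (reward + GAMMA * max(Q[next_state]) - Q[state][action])

# The greedy policy should learn to give green to the longer queue.
policy = [max((0, 1), key=lambda a: Q[s][a]) for s in (0, 1)]
print(policy)
```

Real traffic-light controllers in the surveyed literature use far richer states (queue lengths, waiting times, camera features) and often deep networks in place of this table.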
Chapter 3. Human-in-the-Loop Flight Training of a Quadcopter for Autonomous Systems
Luke Rogers, Alex Redei
The chapter discusses the creation of a human-in-the-loop flight training system that integrates a quadcopter drone with a 2-axis 360-degree flight simulator. The system aims to provide an immersive experience by synchronizing drone movements with the simulator's actions, using telemetry data transmitted via UDP packets. The authors address challenges such as latency, data loss, and the integration of a joystick for control. Experimental results demonstrate the system's potential, with low latency and smooth correlations between drone and simulator movements. The chapter also explores the system's applications in various fields, including drone racing, search and rescue missions, and military use.
Abstract
A software framework was developed connecting a Parrot AR 2.0 quadcopter to a full-motion flight simulator at the Michigan Aerospace Center for Simulations. The combination of a drone with a flight simulator provides for precise remote operations without putting a human pilot at risk. We use the motion capabilities of our flight simulator to keep the pilot oriented consistently with the quadcopter. The result was a responsive system utilizing telemetry data to synchronize the flight simulator to the drone’s movement with low latency. The proposed system was developed over the course of 30 weeks and put through its paces in our lab over two days. This paper outlines our methods, from the software architecture to a detailed description of the hardware, and suggests some directions for further study.
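The UDP telemetry link described in the summary can be sketched as follows; the JSON field names and port number are assumptions for illustration, not the authors' actual packet format:

```python
import json
import socket

# Hypothetical telemetry link: drone-side sender, simulator-side receiver.
SIM_ADDR = ("127.0.0.1", 9870)  # invented port for this sketch

def send_telemetry(sock, pitch, roll, yaw, altitude):
    """Serialize one attitude sample and send it as a single UDP datagram."""
    packet = json.dumps(
        {"pitch": pitch, "roll": roll, "yaw": yaw, "alt": altitude}
    ).encode()
    sock.sendto(packet, SIM_ADDR)

# Simulator side: apply whatever arrives; UDP tolerates occasional loss,
# which suits a low-latency motion feed better than TCP retransmission.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(SIM_ADDR)

tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_telemetry(tx, pitch=1.5, roll=-0.2, yaw=90.0, altitude=12.3)

data, _ = rx.recvfrom(1024)
sample = json.loads(data)
print(sample["yaw"])  # 90.0
rx.close()
tx.close()
```

The choice of UDP over TCP is the relevant design point: a late attitude sample is useless to a motion platform, so dropping it beats stalling the stream to retransmit it.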
Chapter 4. COVID-19: The Importance of Artificial Intelligence and Digital Health During a Pandemic
Maximilian Espuny, José S. da Motta Reis, Gabriel M. Monteiro Diogo, Thalita L. Reis Campos, Vitor H. de Mello Santos, Ana C. Ferreira Costa, Gildarcio S. Gonçalves, Paulo M. Tasinaffo, Luiz A. Vieira Dias, Adilson M. da Cunha, Nilo A. de Souza Sampaio, Andréia M. Rodrigues, Otávio J. de Oliveira
The chapter 'COVID-19: The Importance of Artificial Intelligence and Digital Health During a Pandemic' examines the pivotal role of advanced technologies in addressing the COVID-19 pandemic. It begins by contextualizing the pandemic and the rapid spread of SARS-CoV-2, emphasizing the need for swift and accurate diagnosis. The text delves into how Artificial Intelligence (AI) has been instrumental in enhancing medical diagnosis through X-rays and computed tomography, predicting virus spread, and accelerating drug development. The methodology involves a literature review of 413 indexed studies, with a focus on the 20 most cited articles. The results highlight key research trends and gaps, such as the use of AI for rapid diagnosis, risk management tools, and integration of I4.0 technologies into microbiology and clinical trials. The chapter concludes by underscoring the importance of these technologies in combating the pandemic and preparing for future health crises, encouraging further research and public awareness.
Abstract
Covid-19 has brought about a major change in the way people live, work, and interact. To face the challenges of the epidemic, health professionals and researchers have implemented several technologies from Industry 4.0. In order to elucidate the application of these technologies in the context of the pandemic, the objective of this article is to analyze the main research trends of Technologies 4.0 from the main publications on the subject. Data collection was carried out in the Scopus database in September 2020, and 413 studies were identified. The gaps identified in this research were: applying artificial intelligence and I4.0 technologies to support and speed up Covid-19 diagnosis; implementing risk-management tools to prevent and mitigate new Covid-19 infection waves; integrating I4.0 technologies into microbiology and clinical trials; mapping and sharing data that identify transmission rates and Covid-19 diffusion routes; and searching for treatment alternatives to Covid-19 through algorithms and artificial intelligence. The main academic contribution of this article is to systematize technological trends and to clarify the influence of artificial intelligence on the most urgent issues of the pandemic.
Chapter 5. CropWaterNeed: A Machine Learning Approach for Smart Agriculture
Malek Fredj, Rima Grati, Khouloud Boukadi
The chapter 'CropWaterNeed: A Machine Learning Approach for Smart Agriculture' introduces a novel method for predicting crop water needs using machine learning algorithms. It addresses the challenges of traditional farming methods by integrating smart irrigation systems and advanced technologies. The CropWaterNeed approach combines weather observations, soil parameters, and plant water needs to build a robust prediction model. The methodology involves data collection, preparation, and feature engineering, followed by model selection and evaluation. The chapter highlights the use of the XGBRegressor algorithm, which demonstrated superior performance in predicting irrigation water needs. The work is part of the PRECIMED project, aiming to optimize water management practices in agriculture. The chapter concludes with potential future research directions, including the application of deep learning techniques.
Abstract
In this paper, we propose an approach, CropWaterNeed, to estimate and predict future water needs and maximize productivity in irrigated areas. Unfortunately, we did not identify any available data that could be employed in such a machine-learning process to predict plant water needs. The proposed approach therefore extends the classic machine-learning process: in particular, we define a process to build a dataset that contains plant water requirements. To collect data, we extract meteorological data from the Climwat database and plant water requirements using the Cropwat tool, and then aggregate the extracted data into a dataset. Subsequently, we use the dataset to perform the learning process with XGBRegressor, Decision Tree, Random Forest, and Gradient Boost Regressor. Afterward, we evaluate the model generated by each algorithm using performance measures such as MSE, RMSE, and MAE. Our work shows that the model generated by XGBRegressor is the most efficient in our case, while Random Forest is the least efficient. As future work, we aim to apply the proposed process to test the performance of other regression algorithms and to test the impact of using deep-learning techniques with the extracted data.
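The evaluation metrics the authors use to rank their regressors (MSE, RMSE, MAE) are easy to state directly. The observed/predicted values below are invented toy numbers, not data from the chapter:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error: penalizes large mispredictions quadratically."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root MSE: same units as the target, e.g. mm/day of water need."""
    return math.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    """Mean absolute error: robust, linear penalty on each error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical water-need predictions against observed values (toy numbers).
observed = [2.0, 4.0, 6.0]
predicted = [2.5, 4.0, 5.0]
print(round(mae(observed, predicted), 3))  # 0.5
print(round(mse(observed, predicted), 3))  # 0.417
```

Comparing models on both MSE/RMSE and MAE, as the paper does, is useful because a model can look good on MAE while hiding a few large errors that RMSE exposes.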
Chapter 6. Machine Learning: Towards an Unified Classification Criteria
Clara Burbano, David Reveló, Julio Mejía, Daniel Soto
The chapter begins by discussing the surging popularity of Artificial Intelligence and Machine Learning, driven by their significant economic impact across various industries. It then delves into the definitions of AI and ML, clarifying the distinction between the two. The core of the chapter focuses on the classification of Machine Learning algorithms, exploring different criteria such as cognitive learning types (supervised, unsupervised, reinforced) and other factors influencing algorithm selection. The authors present a unified vision proposal, integrating various classification schemes into a cohesive framework. The chapter concludes with a call for future work to refine and expand this unified classification, reflecting the dynamic nature of the field.
Abstract
In a broad sense, Machine Learning (ML) is the optimization of performance in a certain task through computational means, following a certain criterion and using referential data and/or results from previous iterations. ML is a subset of Artificial Intelligence (AI) and has attracted a substantial amount of research during the last decades. This blooming subject has led to the statement of different definitions of classifications, criteria, algorithms, and so on. This paper summarizes these different definitions and proposes a homologation between them, providing a unified vision for each definition.
-
-
Cybersecurity I
-
Frontmatter
-
Chapter 7. Classification and Update Proposal for Modern Computer Worms, Based on Obfuscation
Hernaldo Salazar, Cristian Barria
The chapter delves into the evolution of computer worms, tracing their origins back to the early 1970s with the Creeper worm. It explores various classifications proposed over the decades, culminating in a new proposal based on obfuscation techniques. This classification is designed to enhance the detection and understanding of modern worms, addressing the increasing complexity and evasion tactics used by malware. The research methodology involves a systematic review of academic articles, highlighting the need for a hierarchical classification to combat the rapid evolution of malware. The proposed model categorizes worms by species, class, type, and evasion methods, providing a robust framework for cybersecurity professionals to combat these threats effectively.
Abstract
Computer worms are a type of malware with a complex technological structure and the ability to automatically create replicas of themselves without human interaction and distribute themselves to other computers connected to the network; they have a malicious-code component that allows them to infect one computer and then use it to infect others. This cycle repeats itself, rapidly increasing the number of infected computers if action is not taken in time. Within this framework, the research is based on a systematic review methodology used to analyze scientific articles related to malware, and specifically to computer worms. Through this review and the abstraction of important data, a synthesis of the results is made to support the research, resulting in a new proposal for the classification of computer worms according to their obfuscation capacity, divided into four levels: species, type, class, and evasion. This classification allows a modern computer worm to be categorized in such a way that it can serve as a model for, or complement to, an Information Security Management System (ISMS) in the systems responsible for detecting and/or defending organizations against worm attacks.
Chapter 8. Conceptual Model of Security Variables in Wi-Fi Wireless Networks: Review
Lorena Galeazzi, Cristian Barría, Julio Hurtado
The chapter delves into the multivariable nature of security in Wi-Fi wireless networks, emphasizing the need for a conceptual model to represent key data. It reviews standards such as ISO/IEC 27001 and NIST 800-53, best practices like CIS controls, and relevant research studies to identify essential security variables. The study employs a narrative review methodology to extract and analyze data, culminating in a Venn diagram to illustrate the relationships between variables. The chapter concludes by establishing a conceptual model comprising Management, Technical Equipment, Protection of the Communication Channel, and End Users, with a particular focus on users as the weakest link in the security chain. This comprehensive approach offers valuable insights for enhancing the security of Wi-Fi wireless networks.
Abstract
Systems, data, users, and networks are essential in terms of information security. Wi-Fi wireless networks play a crucial role in increasing connectivity, as well as in preventing and monitoring unauthorized access. Nonetheless, the security of Wi-Fi wireless networks is conditioned by different variables incorporated in standards, norms, good practices, and various investigations concerning this topic. The present research therefore presents a survey of these variables based on a narrative review, which allows their identification and possibly the incorporation of others. The results obtained from the survey are shown through a conceptual model, which allows visualizing the different aspects required for the security applied to this technology.
Chapter 9. Cybersecurity Analysis in Nodes that Work on the DICOM Protocol, a Case Study
David Cordero, Cristian Barría
The chapter delves into the cybersecurity analysis of nodes operating on the DICOM protocol, a standard for medical image exchange. It focuses on Chile, highlighting the critical nature of these systems in healthcare services. The study identifies active servers, analyzes their vulnerabilities, and assesses the potential impact on data integrity, confidentiality, and availability. Notably, it reveals that 22% of active servers in Chile have high-severity vulnerabilities, emphasizing the urgent need for robust cybersecurity measures to protect sensitive medical data.
Abstract
Currently, the Internet is the main tool for the interconnection of systems and data processing: SCADA systems, marketing, government, and even medical systems need to be connected to the Internet to facilitate the development and processing of their data. Services that operate with the DICOM protocol (Digital Imaging and Communications in Medicine) work with medical equipment. This protocol is the universal format for the exchange of medical images, and it is therefore used worldwide for communication between devices and the so-called PACS servers (Picture Archiving and Communication System). These are information receptacles where medical centers store X-rays, files, personal information of patients, information on the treating physician, and more. The purpose of this research is to carry out a cybersecurity analysis of the operational and connected nodes in Chile that use the DICOM protocol among their services, through the execution of a modified experimental design that allows the discovery of active nodes, the discovery of exposed services and vulnerabilities, the analysis and categorization of those services and vulnerabilities, and finally the validation of the vulnerabilities found. It seeks to establish the current cybersecurity situation of the nodes that use the DICOM protocol for communication, identifying the possible attack vectors that third parties may use to compromise the integrity, confidentiality, availability, and authenticity of these systems.
Chapter 10. Hybrid Security Risk Assessment Model
Robert Banks, Jim Jones, Noha Hazzazi, Pete Garcia, Russell Zimmermann
The chapter introduces a hybrid security risk assessment model that utilizes Bayesian Belief Networks (BBN) to quantitatively estimate the exploitability and impact of new technologies. By integrating public data from MITRE’s CAPEC tools and the NVD, the method provides a more accurate and reliable risk management system. The model is designed to be flexible, accommodating various data sources and levels of abstraction, making it a valuable tool for cybersecurity experts and risk management specialists. The chapter also discusses the limitations of existing risk estimation models and highlights the advantages of the proposed BBN approach, including its ability to handle complex, real-world scenarios and provide actionable insights for decision-makers.
Abstract
Cybersecurity risk management often uses experience-based data to quantify the potential risks of new security technologies based on their exploitability and impact. However, use of such data may be limited and is rarely reusable because it often contains confidential information. This paper proposes a new approach using the Department of Homeland Security’s public National Vulnerability Database (NVD) for information on known vulnerabilities, and MITRE’s public Common Attack Pattern Enumeration and Classification (CAPEC™) tools, as the basis of a risk scoring system.
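A minimal sketch of the Bayesian reasoning behind such a model may be helpful; the two-node network and all probabilities below are invented placeholders for illustration, not values from the paper:

```python
# Hypothetical two-node Bayesian Belief Network:
#   Exploitable -> HighImpact
# In the chapter's setting, priors and conditionals would be estimated from
# public NVD/CAPEC data rather than set by hand as they are here.
P_EXPLOITABLE = 0.3                 # prior: the technology has an exploitable flaw
P_IMPACT = {True: 0.8, False: 0.1}  # P(high-impact event | exploitable?)

# Marginal probability of a high-impact event (law of total probability).
p_impact = (P_EXPLOITABLE * P_IMPACT[True]
            + (1 - P_EXPLOITABLE) * P_IMPACT[False])

# Posterior that a flaw was exploitable, given an observed high-impact
# event (Bayes' rule) -- the kind of quantity a risk analyst would act on.
p_exploitable_given_impact = P_EXPLOITABLE * P_IMPACT[True] / p_impact

print(round(p_impact, 4))                    # 0.31
print(round(p_exploitable_given_impact, 4))  # 0.7742
```

A full BBN chains many such nodes (attack pattern, vulnerability class, asset value), but inference still reduces to this same marginalize-then-condition pattern.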
Chapter 11. Enriching Financial Software Requirements Concerning Privacy and Security Aspects: A Semiotics Based Approach
Leonardo Manoel Mendes, Ferrucio de Franco Rosa, Rodrigo Bonacin
This chapter introduces a novel method called SRAM-PS, which leverages organizational semiotics to systematically enrich financial software requirements with privacy and security aspects. The method is designed to address the complex interplay of technical, legal, organizational, and social factors involved in software development. The authors present a detailed background on requirements elicitation, secure systems development, and organizational semiotics, highlighting the gaps in current methods. The SRAM-PS method is outlined in seven steps, each designed to identify, analyze, and evaluate stakeholders' interests and problems, culminating in the generation of comprehensive security and privacy requirements. A case study demonstrates the practical application of SRAM-PS in a real-world scenario, showcasing its effectiveness and potential challenges. The chapter concludes by emphasizing the value of SRAM-PS for researchers and practitioners aiming to systematize their methods and techniques in software requirements analysis.
Abstract
Enriching software requirements with key security and privacy features requires professionals to have knowledge of requirements-elicitation techniques based on systematic processes and methods. We propose the Software Requirements Analysis Method for Improvement of Privacy and Security (SRAM-PS), which is based on concepts and techniques from Organizational Semiotics and on the analysis of information-security and data-privacy standards. SRAM-PS is a 7-step systematic approach in which an input set of software requirements is analyzed, processed, and then enriched with new security and privacy requirements. A case study with 4 experts was carried out, in which SRAM-PS was used in a real-world scenario: a bank sends a financial transaction receipt containing the customer’s personal data over the Internet. SRAM-PS is aimed at researchers and engineers who analyze and specify software requirements and need to systematize their methods and techniques.
Chapter 12. Efficient Design of Underwater Acoustic Sensor Networks Communication for Delay Sensitive Applications over Multi-hop
Ahmed Al Guqhaiman, Oluwatobi Akanbi, Amer Aljaedi, C. Edward Chow
This chapter delves into the critical aspects of designing efficient underwater acoustic sensor networks for delay-sensitive applications. It highlights the challenges and considerations in monitoring underwater environments, emphasizing the importance of MAC protocols and their impact on network performance. The study includes a detailed analysis of the effects of data rates, network sizes, packet sizes, and network loads on metrics such as end-to-end delay, energy consumption, packet delivery ratio, and collision rate. The authors compare different MAC protocols and underwater commercial modems, providing valuable insights into optimizing underwater communication systems for applications like oil/gas pipeline monitoring.
Abstract
Underwater Acoustic Sensor Networks (UASNs) play a critical role in the remote monitoring of a wide range of time-sensitive underwater applications, such as oil/gas pipeline monitoring to avoid oil spills. In this type of application, the transmission of collected information to the onshore infrastructure within a bounded period of time is critical. Despite the advantages of UASNs over the limitations of Terrestrial Wireless Sensor Networks (TWSNs), the applicability of UASNs in different use cases requires further investigation. In this paper, we investigate different MAC protocols and study the impact of non-environmental factors that may degrade performance. We simulate different MAC protocol approaches based on available underwater commercial modems to find the most efficient MAC protocol approach for the oil/gas industry based on core performance metrics. Our extensive simulation results show that the contention-based random access approach is the most suitable for time-sensitive applications, where the Network Size (NS), followed by Network Load (NL), Data Rate (DR), and Packet Size (PS), respectively, have the strongest impact on delay.
-
- Title
- ITNG 2021 18th International Conference on Information Technology-New Generations
- Editor
-
Dr. Shahram Latifi
- Copyright Year
- 2021
- Publisher
- Springer International Publishing
- Electronic ISBN
- 978-3-030-70416-2
- Print ISBN
- 978-3-030-70415-5
- DOI
- https://doi.org/10.1007/978-3-030-70416-2