
2020 | Book

Computational Science and Its Applications – ICCSA 2020

20th International Conference, Cagliari, Italy, July 1–4, 2020, Proceedings, Part II

Edited by: Prof. Dr. Osvaldo Gervasi, Beniamino Murgante, Prof. Sanjay Misra, Dr. Chiara Garau, Ivan Blečić, David Taniar, Dr. Bernady O. Apduhan, Ana Maria A.C. Rocha, Prof. Eufemia Tarantino, Prof. Carmelo Maria Torre, Prof. Yeliz Karaca

Publisher: Springer International Publishing

Book series: Lecture Notes in Computer Science


About this book

The seven volumes LNCS 12249-12255 constitute the refereed proceedings of the 20th International Conference on Computational Science and Its Applications, ICCSA 2020, held in Cagliari, Italy, in July 2020. Due to the COVID-19 pandemic, the conference was organized as an online event.
Computational Science is the main pillar of most present research and of industrial and commercial applications, and plays a unique role in exploiting innovative ICT technologies.
The 466 full papers and 32 short papers presented were carefully reviewed and selected from 1450 submissions. Apart from the general track, ICCSA 2020 also included 52 workshops in various areas of computational science, ranging from computational science technologies to specific areas such as software engineering, security, machine learning and artificial intelligence, and blockchain technologies, as well as applications in many fields.

Table of Contents

Frontmatter

General Track 3: Geometric Modeling, Graphics and Visualization

Frontmatter
Conditionality of Linear Systems of Equation and Matrices Using Projective Geometric Algebra

Linear systems of equations and their reliable solution are a key part of nearly all computational systems and of the solution of many engineering problems. Mostly, an estimate of the matrix conditionality is used to assess the solvability of linear systems, which is important for interpolation, approximation, and the solution of partial differential equations, especially for large data sets with a large range of values. In this contribution, a new approach to matrix conditionality and the solvability of linear systems of equations is presented. It is based on the application of geometric algebra with a projective space representation using homogeneous coordinates. However, the physical floating-point representation, mostly IEEE 754-2019, has to be strictly respected, as many algorithms fail because of it.

Vaclav Skala
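The paper contrasts its geometric-algebra approach with the conventional conditionality estimate; purely as a point of reference, a minimal sketch of that conventional check (the 2-norm condition number via NumPy) is shown below. The digits-lost heuristic is an illustrative rule of thumb, not part of the paper.

```python
# Conventional conditionality check: the 2-norm condition number of the system matrix.
import numpy as np

def solvability_report(A: np.ndarray) -> dict:
    """Return the condition number of A and a rough digits-lost estimate."""
    cond = np.linalg.cond(A)          # ratio of largest to smallest singular value
    digits_lost = np.log10(cond)      # heuristic: ~log10(cond) decimal digits lost
    return {"cond": cond, "digits_lost": digits_lost}

# Example: a Hilbert-like matrix, notoriously ill-conditioned as n grows.
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
print(solvability_report(A))
```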
Research on the Input Methods of Cardboard

Virtual Reality (VR) is a new technology developed in the 20th century, and demand for VR from all walks of life is increasing; Cardboard plays an important role in meeting that demand. Since existing Cardboard input methods cannot meet actual needs well, this paper examines the four typical Cardboard input methods (buttons, gaze, voice and gestures) through an experiment on typical object-selection tasks, and analyzes the experimental results for sparse, dense, and sheltered dense distributions. The research results have important practical value for the design and development of input for Cardboard-based applications.

Zhiyi Ma, Hongjie Chen, Yanwei Bai, Ye Qiu
Conditionality Analysis of the Radial Basis Function Matrix

Global Radial Basis Functions (RBFs) may lead to ill-conditioned systems of linear equations. This contribution analyzes the conditionality of the Gauss and Thin Plate Spline (TPS) functions. Experiments demonstrated a dependency between the shape parameter and the number of RBF center points for which the matrix is ill-conditioned. The dependency can further be described by an analytical function.

Martin Červenka, Václav Skala
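As a hedged illustration of the dependency studied above (not the authors' code), the sketch below builds the Gaussian RBF interpolation matrix for random center points and reports how its condition number varies with the shape parameter and the number of centers.

```python
# Condition number of the Gaussian RBF matrix A_ij = exp(-(eps * ||x_i - x_j||)^2)
# as a function of the shape parameter eps and the number of centers n.
import numpy as np

def gauss_rbf_matrix(centers: np.ndarray, eps: float) -> np.ndarray:
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2)

rng = np.random.default_rng(0)
for n in (25, 100, 400):                      # number of RBF center points
    centers = rng.uniform(0.0, 1.0, size=(n, 2))
    for eps in (0.5, 2.0, 8.0):               # shape parameter
        cond = np.linalg.cond(gauss_rbf_matrix(centers, eps))
        print(f"n={n:4d}  eps={eps:4.1f}  cond={cond:.3e}")
```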
Information Visualization Applied to Computer Network Security
A Case Study of a Wireless Network of a University

Computer networks are becoming increasingly vital to the activities of organizations, and their monitoring is necessary to ensure proper functioning. Exploiting the human cognitive process in decision-making through information visualization (IV) is a viable option for large amounts of data, such as those generated in network monitoring. Considering the need to monitor modern computer networks and the quality gain when using visualization techniques, the objective was to conduct a review study to understand the process of building a monitoring tool using information visualization resources and, from this review, to follow with a case study of a visualization tool applied to the management of a university wireless network. To this end, a systematic literature review was performed, and then a survey of requirements was conducted with the university network managers. Based on the analysis of the data from the review and the survey, a tool was specified, developed, and evaluated in several units of the university. The results from the review and the requirements survey were thus validated in practice through the use and evaluation of the tool, which was developed following the observed trends. The main contribution of the work is the resulting tool and its impact on the management of the university wireless network, facilitating the activities of managers.

Luiz F. de Camargo, Alessandro Moraes, Diego R. C. Dias, José R. F. Brega
Classification of Active Multiple Sclerosis Lesions in MRI Without the Aid of Gadolinium-Based Contrast Using Textural and Enhanced Features from FLAIR Images

Multiple sclerosis (MS) is an autoimmune demyelinating disease that affects one’s central nervous system. The disease has a number of lesion states. One of them, known as active, or enhancing, indicates that a lesion is under an inflammatory condition. This specific case is of interest to radiologists since it is commonly associated with the period of time in which a patient suffers most from the effects of MS. To identify which lesions are active, a Gadolinium-based contrast is injected into the patient prior to a magnetic resonance imaging procedure. The properties of the contrast medium allow it to enhance active lesions, making them distinguishable from nonactive ones in T1-w images. However, studies from various research groups in recent years indicate that Gadolinium-based contrasts tend to accumulate in the body after a number of injections. Since a comprehensive understanding of this accumulation is not yet available, medical agencies around the world have been restricting its usage to only those cases where it is absolutely necessary. In this work we propose a supervised algorithm to distinguish active from nonactive lesions in FLAIR images, thus eliminating the need for contrast injections altogether. The classification task was performed using textural and enhanced features as input to the XGBoost classifier at the voxel level. Our database comprised 54 MS patients (33 with active lesions and 21 with nonactive ones), with a total of 22 textural and enhanced features obtained from Run Length and Gray Level Co-occurrence Matrices. The average precision, recall and F1-score in a 6-fold cross-validation for the active and nonactive classes were 0.892, 0.968, 0.924 and 0.994, 0.987, 0.991, respectively. Moreover, from a lesion perspective, the algorithm misclassified only 3 active lesions out of 157. These results indicate that our tool can be used by physicians to obtain information about active MS lesions in FLAIR images without using any kind of contrast, thus improving patient health and also reducing the cost of MRI procedures for MS patients.

Paulo G. L. Freire, Marcos Hideki Idagawa, Enedina Maria Lobato de Oliveira, Nitamar Abdala, Henrique Carrete Jr., Ricardo J. Ferrari
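A minimal sketch of the general pipeline named in the abstract, written under assumptions: GLCM texture descriptors (via scikit-image) computed on small patches stand in for the paper's 22 Run Length/GLCM features, and placeholder data replaces the FLAIR voxels; only the XGBoost classification step mirrors the description.

```python
# GLCM texture features per voxel-centred patch, fed to an XGBoost classifier.
# Note: graycomatrix/graycoprops were spelled greycomatrix/greycoprops before scikit-image 0.19.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from xgboost import XGBClassifier

def glcm_features(patch: np.ndarray) -> np.ndarray:
    """Texture features of an 8-bit 2D patch centred on the voxel of interest."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

# X: one feature vector per candidate voxel, y: 1 = active, 0 = non-active (placeholders).
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(200, 9, 9), dtype=np.uint8)
X = np.stack([glcm_features(p) for p in patches])
y = rng.integers(0, 2, size=200)

clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
clf.fit(X, y)
print(clf.predict(X[:5]))
```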
Automatic Positioning of Hippocampus Deformable Mesh Models in Brain MR Images Using a Weighted 3D-SIFT Technique

Automatic hippocampus segmentation in Magnetic Resonance (MR) images is an essential step in systems for early diagnosis and treatment monitoring of Alzheimer’s disease (AD). It allows quantification of hippocampal volume and assessment of its progressive shrinkage, considered the hallmark symptom of AD. Among the methods published in the literature for hippocampus segmentation, those using anatomical atlases and deformable mesh models are the most promising. Although these techniques are convenient ways to embed the shape of the models in the segmentation process, their success greatly depends on the initial positioning of the models. In this work, we propose a new keypoint deformable registration technique that uses a modification of the 3D Scale-Invariant Feature Transform (3D-SIFT) and a keypoint weighting strategy for automatic positioning of hippocampus deformable meshes in brain MR images. Using the Mann-Whitney U test to assess the results statistically, our method showed an average improvement of 11% over the exclusive use of an Affine transformation, 30% over the original 3D-SIFT, and 7% over the non-weighted point procedure.

Matheus Müller Korb, Ricardo José Ferrari, for the Alzheimer’s Disease Neuroimaging Initiative
Exploring Deep Convolutional Neural Networks as Feature Extractors for Cell Detection

Among different biological studies, the analysis of leukocyte recruitment is fundamental for the comprehension of immunological diseases. The task of detecting and counting cells in these studies is, however, commonly performed by visual analysis. Although many machine learning techniques have been successfully applied to cell detection, they still rely on domain knowledge, demanding high expertise to create handcrafted features capable of describing the object of interest. In this study, we explored the idea of transfer learning by using pre-trained deep convolutional neural networks (DCNN) as feature extractors for leukocyte detection. We tested several DCNN models trained on the ImageNet dataset in six different videos of mice organs from intravital video microscopy. To evaluate our extracted image features, we used the multiple template matching technique in various scenarios. Our results showed an average increase of 5.5% in the F1-score values when compared with the traditional application of template matching using only the original image information. Code is available at: https://github.com/brunoggregorio/DCNN-feature-extraction .

Bruno C. Gregório da Silva, Ricardo J. Ferrari
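Below is a hedged sketch of the transfer-learning idea described above: a pre-trained DCNN used as a frozen feature extractor. The choice of VGG16 and the input size are assumptions for illustration; the authors' exact models and template-matching code are at the linked repository.

```python
# Pre-trained DCNN as a fixed feature extractor (downloads ImageNet weights on first use).
# 'pretrained=True' is the legacy torchvision flag; newer releases prefer weights=...
import torch
import torch.nn as nn
import torchvision.models as models

backbone = models.vgg16(pretrained=True).features.eval()   # convolutional part only

@torch.no_grad()
def extract_features(batch: torch.Tensor) -> torch.Tensor:
    """batch: (N, 3, H, W) normalized frames -> (N, C, H', W') convolutional feature maps."""
    return backbone(batch)

frames = torch.randn(2, 3, 224, 224)           # placeholder video frames
feats = extract_features(frames)
print(feats.shape)                              # e.g. torch.Size([2, 512, 7, 7])
```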
A Smartphone Application for Car Horn Detection to Assist Hearing-Impaired People in Driving

This paper presents a smartphone application that displays an on-screen alert and emits a vibration if a car horn is sounded in traffic, aiming to assist hearing-impaired people in driving vehicles. The stages of the construction process are detailed, and ways of obtaining the sound frequency of real-time noise with algorithms based on the Fast Fourier Transform are discussed, as well as the comparison of these frequencies with the usual frequency ranges of car horns. The paper also discusses the problems faced in detecting frequency bands in real traffic, related to the Doppler effect. The testing methodology includes simulations and use of the application in a real traffic environment. As a result of this work we obtained a functional application, customizable by the user, capable of detecting automotive horns.

Cleyton Aparecido Dim, Rafael Martins Feitosa, Marcelle Pereira Mota, Jefferson Magalhães de Morais
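A minimal sketch of the frequency test described above, assuming an FFT over short audio frames; the 300–3000 Hz band and the loudness threshold are illustrative assumptions, not the application's actual parameters.

```python
# Find the dominant frequency of an audio frame and flag it if it falls in an assumed horn band.
import numpy as np

def dominant_frequency(frame: np.ndarray, sample_rate: int) -> float:
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def looks_like_horn(frame: np.ndarray, sample_rate: int,
                    band=(300.0, 3000.0), min_rms=0.05) -> bool:
    rms = float(np.sqrt(np.mean(frame ** 2)))          # ignore quiet frames
    f = dominant_frequency(frame, sample_rate)
    return rms > min_rms and band[0] <= f <= band[1]

sr = 44100
t = np.arange(0, 0.1, 1.0 / sr)
frame = 0.2 * np.sin(2 * np.pi * 440.0 * t)             # synthetic 440 Hz test tone
print(dominant_frequency(frame, sr), looks_like_horn(frame, sr))
```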
A Mathematica Package for Plotting Implicitly Defined Hypersurfaces in ℝ⁴

Plotting implicitly defined geometric objects is a very important topic in computer graphics, computer-aided design and geometry processing. In fact, the most important computer algebra systems include sophisticated tools for plotting implicitly defined curves and surfaces. This paper describes a new Mathematica package, 4DPlots, for plotting implicitly defined hypersurfaces (solids) in ℝ⁴ using a generalization of the bisection method, applied to continuous functions of four variables by recursive bisection of segments contained in their domain. The output obtained is consistent with Mathematica’s notation and results. The performance of the package is discussed by means of several illustrative and interesting examples.

Luis A. Anto, Amelia M. Fiestas, Edgar J. Ojeda, Ricardo Velezmoro, Robert Ipanaqué
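The core idea of the generalized bisection method can be sketched outside Mathematica as well; the Python fragment below (an illustration, not the 4DPlots implementation) recursively bisects a segment in the four-dimensional domain on which the defining function changes sign.

```python
# Bisect a segment [a, b] in R^4 until a point close to the hypersurface F = 0 is isolated.
import numpy as np

def bisect_segment(F, a: np.ndarray, b: np.ndarray, tol: float = 1e-6):
    """Return a point near F=0 on segment [a, b], or None if no sign change is seen."""
    fa, fb = F(a), F(b)
    if fa == 0.0:
        return a
    if fb == 0.0:
        return b
    if fa * fb > 0.0:
        return None                       # no detected sign change on this segment
    while np.linalg.norm(b - a) > tol:
        m = 0.5 * (a + b)
        fm = F(m)
        if fm == 0.0:
            return m
        if fa * fm < 0.0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

# Example: the 3-sphere x^2 + y^2 + z^2 + w^2 - 1 = 0 embedded in R^4.
F = lambda p: p @ p - 1.0
print(bisect_segment(F, np.zeros(4), np.array([2.0, 0.0, 0.0, 0.0])))
```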

General Track 4: Advanced and Emerging Applications

Frontmatter
Strategic Capital Investment Analytics: An Agent Based Approach to California High-Speed Rail Ridership Model

In this paper, we present an agent-based model (ABM) of multi-dimensional transportation choices for individuals and firms given anticipated aggregate traveler demand patterns. Conventional finance, economics and policy evaluation techniques have already been widely adopted to support more evidence-based decision-making, with the aim of understanding the financial, economic and social impacts of transportation choices. Prior scholars have examined common practices used to measure profitability for investment appraisal, including internal rate of return (IRR), net present value (NPV) and risk analysis approaches, incorporating the concepts of the time value of money and uncertainty to assess potential financial gains from different transportation projects. However, conventional capital budget planning or static scenario analysis alone cannot capture significant, interactive and nonlinear project, demand and market uncertainties. Here we build an agent-based model of the current California High-Speed Rail (HSR) to provide insights into firm investment decisions from a computational finance perspective, given the coupling of individual choices, aggregate social demand, and government policy and tax incentives. Given individual-level choice and behavioral aspects, we combine financial accounting and economic theory to identify more precise marginal revenue streams and project profitability over time, to help mitigate both project risk and potential systemic market risk.

Mark Abdollahian, Yi Ling Chang, Yuan-Yuan Lee
The Value of Investing in Domestic Energy Storage Systems

In this paper, we investigate whether investments in battery storage systems, coupled with existing PV plants, are profitable as incentives are phased out. In detail, we analyze the investment decision of a household that has already invested in a PV plant and has to decide whether and when to invest in the adoption of a battery storage system (BSS). We provide a Real Option Model to determine the value of the opportunity to invest and its optimal timing. The existing PV plant gives the household the opportunity to invest in BSS adoption, and this opportunity is analogous to a call option. Our findings show that negative-NPV investments may turn out to be profitable if the household optimally exercises the option to defer. The greater the volatility of energy prices, the greater the option value to defer; and the greater the opportunity cost of waiting (i.e., the greater the drift of energy prices), the smaller the option value to defer.

Chiara D’Alpaos, Francesca Andreolli
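As a rough, hedged illustration of the "option to defer as a call option" analogy only (not the paper's model), the sketch below values the deferral option on a CRR binomial lattice, with the project value as the underlying and the investment cost as the strike; all parameter values are placeholders.

```python
# Option to defer an investment valued as an American call on the project value V
# with strike equal to the investment cost I (Cox-Ross-Rubinstein lattice).
import math

def option_to_defer(V0, I, r, sigma, T, steps=200):
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)          # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs: invest only if project value exceeds the cost
    values = [max(V0 * u**j * d**(steps - j) - I, 0.0) for j in range(steps + 1)]
    for n in range(steps - 1, -1, -1):
        for j in range(n + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = V0 * u**j * d**(n - j) - I
            values[j] = max(cont, exercise)        # defer (continue) vs. invest now
    return values[0]

npv_now = 9500.0 - 10000.0                          # a negative-NPV example
print(npv_now, option_to_defer(V0=9500.0, I=10000.0, r=0.03, sigma=0.25, T=10.0))
```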
Human Health Impact of E-Waste in Mexico

Mexico is the third-largest generator of electronic waste in the Americas, behind only the U.S. and Brazil. The main contribution of this study is the proposal of a sustainability model based on the generic cycle (Reduce, Recycle, Reuse) that allows the useful life of electronic waste to be extended through the reuse of components. It is known that the elements contributing to this waste are: 1) excessive technological consumerism, 2) the short average life cycle of computers, and 3) the huge number of mobile devices, batteries, and other gadgets discarded due to obsolescence. These phenomena increase the generation of electronic garbage every day, contributing to environmental pollution and the poisoning of soil, water, and air. This work presents a methodology with the following elements: i) interviews, ii) a survey, iii) data analysis, iv) correlations, and v) results on the impact on human health. In addition, the role played by software-developing companies in the context of problematic cities is discussed: their applications require ever greater hardware resources to work well, which shortens the useful life of devices and turns them into technological waste. The proposed model is based on: 1) awareness (reduce and reuse), 2) recycling (techniques and ways to reuse components so that they do not become garbage), and 3) solution-prevention that helps minimize the impact on human health (reuse).

J. Leonardo Soto-Sumuano, José Luis Cendejas-Valdez, Heberto Ferreira-Medina, J. Alberto Tlacuilo-Parra, Gustavo Abraham Vanegas-Contreras, Juan J. Ocampo-Hidalgo
A General Model for Electroencephalography-Controlled Brain-Computer Interface Games

The rapid expansion of Brain-Computer Interface (BCI) technology has allowed the recent development of applications outside clinical environments, such as education, the arts and games. Games controlled by electroencephalography (EEG), a specific case of BCI technology, benefit from both areas, since they can be played by virtually any person regardless of physical condition, can be applied in numerous serious and entertainment contexts, and are ludic by nature. However, they also share the design and development challenges of both fields, especially since they demand extensive specific and specialized knowledge for their development. In this sense, this work presents a model for games using EEG-based BCI controls. The proposed model is intended to help researchers describe, compare and develop new EEG-controlled games by instantiating its abstract and functional components using concepts from the fields of BCI and games. A group of EEG-controlled games from the literature was selected to demonstrate the usefulness and representativeness of the model. The demonstration showed that both an overview classification and the details of the selected games could be described using the model and its components.

Gabriel Alves Mendes Vasiljevic, Leonardo Cunha de Miranda

International Workshop on Advanced Transport Tools and Methods (A2TM 2020)

Frontmatter
Introducing the Concept of Interaction Model for Interactive Dimensionality Reduction and Data Visualization

This letter formally introduces the concept of an interaction model (IM), which has been used either directly or tangentially in previous works but never defined. Broadly speaking, an IM consists of the use of a mixture of dimensionality reduction (DR) techniques within an interactive data visualization framework. The rationale for creating an IM is the need to simultaneously harness the benefits of several DR approaches to reach a data representation that is intelligible and/or fitted to a user's criterion. As a remarkable advantage, an IM naturally provides a generalized framework for designing both interactive DR approaches and ready-to-use data visualization interfaces. In addition to a comprehensive overview of the basics of data representation and dimensionality reduction, the main contribution of this manuscript is the elegant definition of the concept of an IM in mathematical terms.

M. C. Ortega-Bustamante, W. Hasperué, D. H. Peluffo-Ordóñez, M. Paéz-Jaime, I. Marrufo-Rodríguez, P. Rosero-Montalvo, A. C. Umaquinga-Criollo, M. Vélez-Falconi
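One plausible, simplified instantiation of an interaction model, offered as an assumption rather than the authors' formal definition: a user-controlled convex combination of two scikit-learn DR embeddings that can be re-mixed interactively.

```python
# Blend a PCA view and a t-SNE view of the data with a user-controlled weight w in [0, 1].
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

def mixed_embedding(X: np.ndarray, w: float) -> np.ndarray:
    """w = 0 -> pure PCA view, w = 1 -> pure t-SNE view."""
    e_pca = PCA(n_components=2).fit_transform(X)
    e_tsne = TSNE(n_components=2, init="pca", perplexity=30).fit_transform(X)
    # normalise each embedding so the blend is not dominated by scale differences
    e_pca /= np.abs(e_pca).max()
    e_tsne /= np.abs(e_tsne).max()
    return (1.0 - w) * e_pca + w * e_tsne

X = np.random.default_rng(0).normal(size=(200, 10))
print(mixed_embedding(X, w=0.3).shape)        # (200, 2)
```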
Some Considerations on the Role of Universities and Research Centers in EU-Funded Sustainable Mobility Projects

Stakeholder involvement is now part of the formal requirements of almost any transportation decision-making process in Europe, increasing complexity while allowing for better, shared decisions. European institutions strongly promote participatory processes and have developed a regulatory framework as well as guidelines and tools for successful and effective public engagement in transport planning. In this context, a variety of EU-funded projects have been set up in which territorial partners cooperate with universities and research centers in developing a sustainable mobility project and related public engagement strategies. This paper digs into the history and current state of stakeholder involvement in transport projects, discussing through a broad literature analysis the theoretical evolution of the concept, its controversies, and the drivers, phases and tools for effective engagement practices. Through the examples of the experience within the European projects SMILE and SMART COMMUTING, this paper explores the role that academic institutions can play in engagement processes and their possible contributions in terms of technical expertise and know-how transfer. Intermediate results from the projects' engagement efforts seem to validate the European Commission's belief that planned, continuous, open and interactive involvement of universities may lead to better, shared and desirable decisions, consistent with findings from recent literature.

Francesco Bruzzone, Silvio Nocera
A Preliminary Investigation of Machine Learning Approaches for Mobility Monitoring from Smartphone Data

In this work we investigate the use of machine learning models for the management and monitoring of sustainable mobility, with particular reference to transport mode recognition. The specific aim is to automate the detection of the user's means of transport among those considered in the data collected with an app installed on users' smartphones, i.e. bicycle, bus, train, car, motorbike, and pedestrian locomotion. Preliminary results show the potential of the analysis for the introduction of reliable, advanced, machine learning based monitoring systems for sustainable mobility.

Claudio Gallicchio, Alessio Micheli, Massimiliano Petri, Antonio Pratelli
GOOD_GO: An Open-Source Platform to Incentive Urban Sustainable Mobility

Good_Go is the first complete platform to incentivize sustainable urban mobility. It contains different modules to encourage the use of sustainable mobility modes such as walking, cycling, buses or innovative sharing solutions (car-pooling, car-sharing, bike-sharing and others). A first module of the platform is linked to a bike-theft disincentive system with an innovative Bluetooth OBU (On-Board Unit) called BlueBI, able to sound an acoustic alarm in case of theft and allowing participative recovery of stolen bikes. A second module is linked to a rewarding platform, while a third module allows the organization of Mobility Management measures or sustainable mobility competitions at different scales (a whole city, an institutional system such as a hospital or university, or a single company or school).

Simone Giannecchini, Paolo Nepa, Alessio Rofi, Massimiliano Petri, Antonio Pratelli
Development of a Tool for Control Loop Performance Assessment

This article describes the primary characteristics of a tool developed to perform control loop performance assessment, named SELC after its acronym in Spanish. With this tool, we expect to increase the reliability and efficiency of productive processes in Colombia's industry. A brief description of SELC's functionality and a literature review of the different techniques integrated are presented. Finally, the results and conclusions of the testing phase, performed with both simulated and real data, are presented. The real data come from an online industrial repository provided by the South African Council for Automation and Control (SACAC).

Javier Jiménez-Cabas, Fabián Manrique-Morelos, Farid Meléndez-Pertuz, Andrés Torres-Carvajal, Jorge Cárdenas-Cabrera, Carlos Collazos-Morales, Ramón E. R. González
Mobility Impacts of the Second Phase of Covid-19: General Considerations and Regulation from Tuscany (Italy) and Kentucky (USA)

The second phase of the Covid-19 pandemic is about to bring a new configuration of accessibility to activities and cities. This phase, which may see different restriction levels both between countries and between successive periods, is the great challenge the whole world is facing and which, if not managed in a planned and strategic way, risks turning into a further catastrophe. The social distancing rules imposed will necessarily lead to a flight from public transport in cities, which could turn into total congestion of city traffic, bringing the cities themselves to paralysis. We need a series of countermeasures that define a new mobility capable of mitigating the effects of the imbalance in the mobility supply by intervening quickly, economically and, in the short term, on an emergency basis across the whole transport chain. This article presents some possible actions to be put in place, and some mobility measures actually applied in the Tuscan coastal area.

Irene Nicotra, Massimiliano Petri, Antonio Pratelli, Reginald R. Souleyrette, Teng (Alex) Wang

International Workshop on Advances in Artificial Intelligence Learning Technologies: Blended Learning, STEM, Computational Thinking and Coding (AAILT 2020)

Frontmatter
Model of Intelligent Massive Open Online Course Development

The development and extension of massive open online courses (MOOCs) have gained a new wave of relevance due to the coronavirus pandemic. The importance and necessity of MOOCs will keep increasing; however, intelligent systems have changed qualitatively since the development of the first MOOCs. This paper suggests developing an intelligent MOOC using a Kazakh-language thesaurus approach. The model of an intelligent MOOC builds its intelligence in at the design stage, using a knowledge base, an ontological model of the discipline, and the corresponding question-answering system and intelligent search. A separate important part of each such MOOC is the intelligent assessment of knowledge and of the achievement of the announced learning outcomes. The suggested MOOC model makes it a more effective means for distance, blended and any other form of e-learning. The intelligent MOOC can be used in e-learning systems without a tutor.

Gulmira Bekmanova, Assel Omarbekova, Zulfiya Kaderkeyeva, Altynbek Sharipbay
Flexible Model for Organizing Blended and Distance Learning

The proposed flexible model for organizing blended and distance learning involves the creation of an individual learning path based on testing the student before training starts. Based on the test results, the student is assigned to a learning path. The learning path consists of mandatory and optional training modules; optional modules can be skipped by passing the corresponding test successfully. The training model is represented using an ontological model, and the decision-making rules for the model are logical rules.

Gulmira Bekmanova, Yerkin Ongarbayev
Reshaping Higher Education with e-Studium, a 10-Years Capstone in Academic Computing

E-Studium was a long-running project of blended e-learning for higher education based on the learning management system Moodle, implemented at the University of Perugia, Italy, from 2005 to 2015. The capstone culminated in a refined final product that forms the basis of the current academic platform Unistudium. In its ten years of activity, e-Studium has been a learning-pathway experience for a variety of applications, including STEM courses, from high-school education to highly specialised academic courses, teacher qualification, and third-mission activities for technology transfer, with a particular focus on usability and teachers' self-evaluation. The analysis of both objective and subjective evaluations, collected over ten years from activity logs, web analytics, global rankings, and ad hoc questionnaires, together with teachers' and students' outcomes, shows how e-Studium contributed to reshaping the educational offering of large-scale learning at the University of Perugia and in Italy. This paper aims at showing the evolution and the outcomes of e-Studium, in the light of the contemporary natural evolution of the technological learning economy, assessing and sharing educational experiences based on the evolution of innovative technologies for lifelong e-learning. A particular focus is given to how such a contribution can help in the current extraordinary situation, which sees a worldwide, unexpected and abrupt need for remote communication due to the COVID-19 emergency.

Valentina Franzoni, Simonetta Pallottelli, Alfredo Milani

International Workshop on Advancements in Applied Machine Learning and Data Analytics (AAMDA 2020)

Frontmatter
BodyLock: Human Identity Recogniser App from Walking Activity Data

A person’s identity can be recognized from biometric data such as fingerprints, voice or gait. Recognition from gait requires sensors capable of detecting changes in the speed and direction of movement, and such sensors are readily available on almost every smartphone model. We perform user identity verification using walking activity data captured by smartphone sensors. To support identity verification, we have developed a mobile application for Android-based devices, which has achieved 97% identity verification accuracy using data from the acceleration, gravity and gyroscope sensors of a smartphone and a linear Support Vector Machine (SVM) classifier. The developed unobtrusive human walking analyser provides an additional active layer of protection, which may invoke a stronger authentication measure (mandatory locking) if a threat threshold is exceeded.

Karolis Kašys, Aurimas Dundulis, Mindaugas Vasiljevas, Rytis Maskeliūnas, Robertas Damaševičius
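A minimal sketch of the verification idea under stated assumptions: windows of smartphone motion data are summarized by simple statistical features and fed to a linear SVM; the feature set and window length are illustrative, and synthetic arrays stand in for real sensor streams.

```python
# Window-level gait features and a linear SVM separating the owner from other users.
import numpy as np
from sklearn.svm import LinearSVC

def window_features(window: np.ndarray) -> np.ndarray:
    """window: (samples, channels), e.g. 3-axis acceleration + 3-axis gyroscope."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           np.abs(np.diff(window, axis=0)).mean(axis=0)])

rng = np.random.default_rng(0)
owner = rng.normal(0.0, 1.0, size=(100, 128, 6))     # placeholder walking windows
others = rng.normal(0.5, 1.2, size=(100, 128, 6))
X = np.stack([window_features(w) for w in np.concatenate([owner, others])])
y = np.array([1] * 100 + [0] * 100)                  # 1 = owner, 0 = someone else

clf = LinearSVC(C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```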
An Intelligent Cache Management for Data Analysis at CMS

In this work, we explore a score-based approach to managing a cache system. With the proposed method, the cache can better discriminate among the input requests and improve overall performance. We created a score-based discriminator using file statistics. The score represents the weight of a file. We tested several functions for computing the file weight, which is used to determine whether a file has to be stored in the cache or not. We developed and evaluated our solution on a real cache manager named XCache, which is used within the Compact Muon Solenoid (CMS) data analysis workflow. The aim of this work is to reduce the maintenance costs of the cache system without compromising the user experience.

Mirco Tracolli, Marco Baioletti, Diego Ciangottini, Valentina Poggioni, Daniele Spiga
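A toy sketch of a score-based admission policy in the spirit described above; the weight function and threshold are placeholders, not the functions evaluated in the paper.

```python
# Each file accumulates a weight from its request statistics; it is admitted to the
# cache only when the weight crosses a threshold.
from collections import defaultdict

class ScoreCache:
    def __init__(self, capacity_files: int, threshold: float):
        self.capacity = capacity_files
        self.threshold = threshold
        self.stats = defaultdict(lambda: {"hits": 0, "size_gb": 0.0})
        self.cache = set()

    def weight(self, filename: str) -> float:
        s = self.stats[filename]
        return s["hits"] / (1.0 + s["size_gb"])     # favour popular, small files

    def request(self, filename: str, size_gb: float) -> bool:
        """Record a request; return True if served from the cache."""
        s = self.stats[filename]
        s["hits"] += 1
        s["size_gb"] = size_gb
        if filename in self.cache:
            return True
        if self.weight(filename) >= self.threshold and len(self.cache) < self.capacity:
            self.cache.add(filename)                # admit for future requests
        return False

cache = ScoreCache(capacity_files=2, threshold=0.5)
for f in ["a.root", "b.root", "a.root", "a.root", "c.root"]:
    print(f, cache.request(f, size_gb=1.5))
```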
Software Defect Prediction on Unlabelled Datasets: A Comparative Study

Background: Defect prediction on unlabelled datasets is a challenging and widespread problem in software engineering. Machine learning is of great value in this context because it provides techniques, called unsupervised, that are applicable to unlabelled datasets. Objective: This study aims at comparing various approaches employed over the years on unlabelled datasets to predict the defective modules, i.e. the ones which need more attention in the testing phase. Our comparison is based on the measurement of performance metrics and on the real defect information derived from software archives. Our work leverages a new dataset obtained by extracting and preprocessing metrics from a C++ software project. Method: Our empirical study has taken advantage of CLAMI and its improvement CLAMI+, which we have applied to high energy physics software datasets. Furthermore, we have used clustering techniques such as the K-means algorithm to find potentially critical modules. Results: Our experimental analysis was carried out on one open source project with 34 software releases. We applied 17 ML techniques to the labelled datasets obtained by following the CLAMI and CLAMI+ approaches. The two approaches were evaluated using different performance metrics; our results show that CLAMI+ performs better than CLAMI. The average predictive accuracy is around 95% for 4 of the 17 ML techniques, which show a Kappa statistic greater than 0.80. We applied K-means to the same dataset and obtained 2 clusters, labelled according to the output of CLAMI and CLAMI+. Conclusion: Based on the results of the different statistical tests, we conclude that no significant performance differences were found among the selected classification techniques.

Elisabetta Ronchieri, Marco Canaparo, Mauro Belgiovine
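The clustering step can be sketched as follows (CLAMI/CLAMI+ themselves use a metric-violation labelling scheme that is not reproduced here): K-means with two clusters on module metrics, with the higher-metric cluster treated heuristically as defect-prone.

```python
# K-means (k = 2) on standardized software metrics; flag the "higher complexity" cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
metrics = rng.lognormal(mean=1.0, sigma=0.6, size=(300, 8))   # placeholder module metrics

X = StandardScaler().fit_transform(metrics)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Heuristic label assignment: the cluster with higher mean metric values -> "defect-prone"
cluster_means = [X[km.labels_ == c].mean() for c in (0, 1)]
defect_cluster = int(np.argmax(cluster_means))
predicted_defective = km.labels_ == defect_cluster
print(predicted_defective.sum(), "modules flagged as potentially defective")
```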
Artificial Intelligence in Health Care: Predictive Analysis on Diabetes Using Machine Learning Algorithms

Background: Healthcare organizations are producing heaps of data at an alarming rate. These data comprise medical records, genome and omics data, image scans and wearable medical device data, which present immense advantages and challenges at the same time. These ever-growing challenges can be met by applying effective artificial intelligence tools. Methods: This paper uses a large volume of multimodal patient data to compute correlations between the Body Mass Index, Blood Pressure, Glucose levels, Diabetes Pedigree Function and Skin Thickness of people in different age groups with diabetes. Python and data analytics packages are used to predict diabetes among people. Results: The blood pressure of diabetic people is around 75–85 mmHg and sometimes even higher, whereas it is in the range of 60–75 mmHg for non-diabetic people. People with a high body mass index and glucose levels of 120–200 mg/dl and more are found to be diabetic, as against the lower body mass index and glucose levels of 85–105 mg/dl of normal people. The Diabetes Pedigree Function count of diabetic people has a peak at 0.25, whereas it is 0.125 for non-diabetic people. A similar slight difference in the values of Age and Skin Thickness has been found between diabetic and non-diabetic people. Conclusion: The above results indicate a strong relationship between the Blood Pressure, BMI and Glucose levels of people with diabetes, whereas a moderate correlation has been found for the Age, Skin Thickness and Diabetes Pedigree Function count of people with diabetes. Although the present analysis confirms many previous research findings, validating these inferences through analytical tools is the sole purpose of this paper.

Shruti Wadhwa, Karuna Babber
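A minimal sketch of the correlation step with pandas; the column names follow the well-known Pima-style diabetes schema as an assumption, and the values are synthetic placeholders rather than the paper's data.

```python
# Pearson correlation of each feature with the diabetes outcome.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "Glucose": rng.normal(120, 30, 500),
    "BloodPressure": rng.normal(72, 12, 500),
    "BMI": rng.normal(32, 7, 500),
    "SkinThickness": rng.normal(29, 10, 500),
    "DiabetesPedigreeFunction": rng.gamma(2.0, 0.25, 500),
    "Age": rng.integers(21, 70, 500),
    "Outcome": rng.integers(0, 2, 500),          # 1 = diabetic, 0 = non-diabetic
})

print(df.corr()["Outcome"].sort_values(ascending=False))
```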
Learning to Classify Text Complexity for the Italian Language Using Support Vector Machines

Natural language processing is undoubtedly one of the most active fields of research in the machine learning community. In this work we propose a supervised classification system that, given as input a text written in the Italian language, predicts its linguistic complexity in terms of a level of the Common European Framework of Reference for Languages (better known as CEFR). The system was built by considering: (i) a dataset of texts labelled by linguistic experts, (ii) vectorisation procedures which transform any text into a numerical representation, and (iii) the training of a support vector machine model. Experiments were conducted following a statistically sound design, and the experimental results show that the system is able to reach a good prediction accuracy.

Valentino Santucci, Luciana Forti, Filippo Santarelli, Stefania Spina, Alfredo Milani
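One plausible pipeline for the task described above, offered as a sketch under assumptions: character n-gram TF-IDF features and a linear SVM; the authors' actual vectorisation procedures and labelled corpus are not reproduced, and the texts and CEFR labels below are placeholders.

```python
# TF-IDF vectorisation followed by a linear SVM predicting a CEFR level.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["Ciao, come stai?",
         "Il dibattito sulla politica monetaria europea rimane estremamente complesso.",
         "Domani vado al mercato con mia sorella.",
         "L'analisi sintattica evidenzia una subordinazione articolata e densa."]
levels = ["A1", "C1", "A2", "C1"]                 # placeholder CEFR labels

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LinearSVC(C=1.0),
)
model.fit(texts, levels)
print(model.predict(["Vorrei prenotare un tavolo per due, per favore."]))
```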

International Workshop on Advanced Computational Approaches in Artificial Intelligence and Complex Systems Applications (ACAC 2020)

Frontmatter
Deep Learning for Blood Glucose Prediction: CNN vs LSTM

To manage their disease, diabetic patients need to control their blood glucose level (BGL) by monitoring it and predicting its future values. This makes it possible to avoid high or low BGL by taking recommended actions in advance. In this study, we propose a Convolutional Neural Network (CNN) for BGL prediction. This CNN is compared with a Long Short-Term Memory (LSTM) model for both one-step and multi-step prediction. The objectives of this work are: 1) determining the best configuration of the proposed CNN, 2) determining the best multi-step forecasting (MSF) strategy using the obtained CNN for a prediction horizon of 30 min, and 3) comparing the CNN and LSTM models for one-step and multi-step prediction. Toward the first objective, we conducted a series of experiments through parameter selection. Then five MSF strategies were developed for the CNN to reach the second objective. Finally, for the third objective, comparisons between the CNN and LSTM models were conducted and assessed with the Wilcoxon statistical test. All the experiments were conducted using 10 patients' datasets and the performance is evaluated through the Root Mean Square Error. The results show that the proposed CNN significantly outperformed the LSTM model for both one-step and multi-step prediction, and no MSF strategy outperforms the others for the CNN.

Touria El Idrissi, Ali Idri
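A minimal PyTorch sketch of a 1D CNN for one-step BGL forecasting from a window of past readings; the layer sizes, window length and horizon are illustrative assumptions, not the configuration selected in the paper.

```python
# 1D CNN regressor: a window of past glucose readings -> the next BGL value.
import torch
import torch.nn as nn

class BGLConvNet(nn.Module):
    def __init__(self, window: int = 24):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 1)            # predicted BGL one step ahead

    def forward(self, x):                        # x: (batch, 1, window)
        return self.head(self.features(x).squeeze(-1))

model = BGLConvNet()
past = torch.randn(8, 1, 24)                     # 8 windows of 24 past readings
print(model(past).shape)                         # torch.Size([8, 1])
```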
Reconsidering the Risk Society: Its Parameters and Repercussions Evaluated by a Statistical Model with Aspects of Different Social Sciences

Risk society theory, with its complexity and inclusion of different ideas with respect to the peculiarity of risk in the modern era, the role played by the media in the construction and communication of risk as well as the nature of reflexivity, has had repercussions in various fields. Since multiple elements are to be analysed to reconsider the risk society theory in the current global landscape characterized by emerging threats, an integrated approach has been chosen in this study for a comprehensive evaluation. The majority of the studies conducted on the relevant subject matter lack quantitative analyses; therefore, this study aims at bridging a gap in this regard by elucidating the parameters and implications of the concepts related to the risk society based on the results obtained by quantitative and qualitative analyses. To this end, a survey (including demographic, sociological and psychological items) on risk and economic uncertainty was designed and conducted online. Secondly, content analysis was done on a set of news items focusing on global economy. The results of the survey evaluated with statistical analyses (ANOVA, t-test and correlation analysis) were used to relate the responses to different aspects and thematic qualities of the risk society theory postulated and developed by Ulrich Beck, Anthony Giddens and other social philosophers. The experimental results of the study revealed certain relationships between demographic characteristics and sociological-psychological elements of the risk society theory and its parameters in the individuals’ attitudes and perception. Correspondingly, repercussions of the risk society theory have been revealed in the news items handled. The majority of the results obtained support the key postulations of the risk society, which can shed light on understanding the significant transformation of our era along with the social attitudes, fears, insecurity and risk perception among individuals. In addition, the findings can lead to further interpretation of the interplay between economy, media, science and politics, opening up new perspectives toward reconsidering “contemporary” risks.

Ahu Dereli Dursun
Theory, Analyses and Predictions of Multifractal Formalism and Multifractal Modelling for Stroke Subtypes’ Classification

The interplay of fractal and multifractal analysis within a complementary methodology is of pivotal importance in both natural and man-made systems. Since the brain, as a complex system, operates on a multitude of scales, the characterization of its dynamics through the detection of self-similarity and regularity presents certain challenges. One framework for digging into complex dynamics and structure is to use the intricate properties of multifractals. Morphological and functional points of view guide the analysis of the central nervous system (CNS). The former focuses on the fractal and self-similar geometry at various levels of analysis, ranging from a single cell to complicated networks of cells. The latter point of view is defined by a hierarchical organization where self-similar elements are embedded within one another. Stroke is a CNS disorder that occurs via a complex network of vessels and arteries. Considering this profound complexity, the principal aim of this study is to develop a complementary methodology to enable the detection of subtle details concerning stroke which may easily be overlooked during regular treatment procedures. In the proposed method, a multifractal regularization method is employed for singularity analysis to extract the hidden patterns in a stroke dataset, following two different approaches. In the first approach, decision tree, Naïve Bayes, kNN and MLP algorithms were applied to the stroke dataset. The second approach is made up of two stages: i) the multifractal regularization (Kullback normalization) method was applied to the stroke dataset and the mFr_stroke dataset was generated; ii) the four algorithms stated above were applied to the mFr_stroke dataset. When we compared the experimental results obtained from the stroke dataset and the mFr_stroke dataset in terms of accuracy (specificity, sensitivity, precision, F1-score and Matthews Correlation Coefficient), it was revealed that the mFr_stroke dataset achieved higher accuracy rates. Our novel approach can serve for understanding and keeping under control the transient features of stroke. Notably, the study has revealed the reliability, applicability and high accuracy of the methods proposed. Thus, the integrated method has revealed the significance of fractal patterns and accurate prediction of diseases in diagnostic and other critical decision-making processes in related fields.

Yeliz Karaca, Dumitru Baleanu, Majaz Moonis, Yu-Dong Zhang
Multifractional Gaussian Process Based on Self-similarity Modelling for MS Subgroups’ Clustering with Fuzzy C-Means

Multifractal analysis is a beneficial way to systematically characterize the heterogeneous nature of both theoretical and experimental patterns of fractals. It tackles the singularity structure of functions or signals both locally and globally. While the Hölder exponent at each point provides the local information, the global information is attained by characterizing the statistical or geometrical distribution of the Hölder exponents, referred to as the multifractal spectrum. This analysis is time-saving when dealing with irregular signals; hence, it is used extensively. Multiple Sclerosis (MS) is a chronic auto-immune disease characterized by damage to the Central Nervous System (CNS); it is a neurological disorder exhibiting dissimilar and irregular attributes that vary among patients. In our study, the MS dataset consists of the Expanded Disability Status Scale (EDSS) scores and Magnetic Resonance Imaging (MRI) (taken in different years) of patients diagnosed with the MS subgroups (relapsing remitting MS (RRMS), secondary progressive MS (SPMS) and primary progressive MS (PPMS)), while healthy individuals constitute the control group. This study aims to identify similar attributes in homogeneous MS clusters and dissimilar attributes in different MS subgroup clusters, and thus to demonstrate the applicability and accuracy of the proposed method based on such cluster formation. Within this framework, the approach we propose follows these steps for the classification of the MS dataset. Firstly, multifractal denoising with a Gaussian process is employed to identify the critical and significant self-similar attributes through the removal of noise from the MS dataset, by which the mFd_MS dataset is generated. As a further step, the Fuzzy C-means algorithm is applied for the classification of both datasets. Based on the experimental results derived within the scheme of the applicable and efficient proposed method, it is shown that the mFd_MS dataset yielded a higher accuracy rate, since the critical and significant self-similar attributes were identified in the process. This study can provide future direction in different fields such as medicine, the natural sciences and engineering as a result of the proposed model and the application of alternative mathematical models. The experimental results of the study confirm the efficiency, reliability and applicability of the proposed method. Thus, it is hoped that the results derived from the thorough analyses and algorithmic applications will provide guidance for related studies in the future.

Yeliz Karaca, Dumitru Baleanu
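A compact NumPy implementation of Fuzzy C-means, the clustering step named above; the multifractal denoising that produces the mFd_MS dataset is not reproduced, and the data below are synthetic.

```python
# Fuzzy C-means with fuzzifier m = 2: alternate between membership and center updates.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))       # u_ij proportional to d_ij^(-2/(m-1))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

X = np.random.default_rng(1).normal(size=(150, 4))
centers, memberships = fuzzy_c_means(X, c=3)
print(centers.shape, memberships.argmax(axis=1)[:10])
```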
Decision Tree-Based Transdisciplinary Systems Modelling for Cognitive Status in Neurological Diseases

This paper addresses an up-to-date transdisciplinary system modelling concept based on decision trees within the framework of systems theory. Systems theory constructs effective models for the analysis of complex systems, since this comprehensive theory is capable of providing links between the problems and the dynamics of systems. Particularly in complex and challenging environments, problems can be managed more effectively using a systems approach. Neurological diseases concern the brain, which has a complex structure and dynamics. Being equipped with accurate medical knowledge plays a critical role in tackling these neurological problems. The interconnected relationships require a carefully characterized transdisciplinary approach integrating a systems approach and mathematical modelling. Effective solutions lie in cognitive status, namely awareness and a satisfactory level of health knowledge. Within this framework, this study aims at revealing the lack of required general and medical health knowledge on neurological diseases (Alzheimer's, dementia, Parkinson's, stroke, epilepsy and migraine) among individuals. For this purpose, an online survey was conducted with 381 respondents, through which awareness of medical knowledge and general health knowledge was assessed for each disease. The following methods were applied: firstly, a rule-based decision tree algorithm was applied, since its structure enables the interpretation of the data and works effectively with feature computations; subsequently, statistical analyses were performed. The decision tree analyses and statistical analyses reveal parallel results, which demonstrate that, compared with the knowledge of elderly people, the knowledge of the young population is limited in both general and medical health knowledge. Compared with previous works, no related work exists in the literature where a transdisciplinary approach with these proposed methods is used. The experimental results demonstrate the significant difference between medical knowledge and general health knowledge among individuals depending on different attributes. The study attempts to reveal a new approach for dealing with diseases, developing positive attitudes and establishing effective long-term behavioural patterns and strategies based on the required knowledge and mindfulness.

Yeliz Karaca, Elgiz Yılmaz Altuntaş
Test Automation with the Gauge Framework: Experience and Best Practices

While Behavior-Driven Development (BDD) tools such as Cucumber are powerful tools for automated testing, they have certain limitations. For example, they often enforce a strict syntax for test cases, like the "Given-When-Then" format, which may not always be easy to write for a given test case. A new test automation framework named Gauge (gauge.org) addresses that limitation, since it does not prescribe the BDD testing process with a strict syntax. In Gauge, writing a test case is as easy as writing down the flow of the test case as several itemized sentences in a natural language, like English. In the context of Testinium (testinium.com), a large software testing company which provides software testing services, tools and solutions to a large number of clients, we have actively used the Gauge framework since 2018 to develop large automated front-end test suites for several large web applications. In this paper/talk, the speakers share several examples and best practices for developing automated tests as natural-language requirements using the Gauge framework. By learning from the ideas presented in the talk, readers (attendees) will be able to consider applying the Gauge framework in their own test automation projects.

Vahid Garousi, Alper Buğra Keleş, Yunus Balaman, Zeynep Özdemir Güler
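A hedged illustration of the Gauge style described above: a spec is itemized natural-language text, and each step maps to a function in a step-implementation file. The spec text, step wording and function below are made-up examples using the getgauge Python runner, not taken from the authors' test suites.

```python
# Example spec file (login.spec), shown here as a comment; steps are plain itemized sentences:
#
#   # Login
#   ## Valid user can sign in
#   * Open the login page
#   * Sign in as "alice" with password "secret"
#   * The dashboard is shown
#
# Matching step implementation with the getgauge Python library:
from getgauge.python import step

@step("Sign in as <user> with password <password>")
def sign_in(user, password):
    # In a real suite this would drive the browser (e.g. via Selenium).
    print(f"signing in as {user}")
```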
Coupling an Agent-Based Model with a Mathematical Model of Rift Valley Fever for Studying the Impact of Animal Migrations on the Rift Valley Fever Transmission

Rift Valley fever (RVF) is a disease that principally kills animals. In this article, we couple a mathematical model of animal-mosquito interactions with an agent-based model describing the migrations of hosts between cities. The mathematical model describes animal-mosquito interactions in each city, and the agent-based model describes the migrations of animals between cities. The coupled model makes it possible to compute, at each time step, the number of infected animals in all cities and to study the impact of host migrations on the dynamics of infections. The results obtained show that quarantining certain cities can reduce the number of infected hosts. It is also observed that when the density of animal migrations increases, the number of infection cases increases. The developed model addresses the limits of both component models (the mathematical model and the agent-based model). This model could help to study and forecast Rift Valley fever transmission and its outbreaks in the short and long term.

Paul Python Ndekou Tandong, Papa Ibrahima Ndiaye, Alassane Bah, Dethie Dione, Jacques André Ndione
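A toy sketch of the coupling idea under strong simplifications: a crude SIR-like update stands in for the paper's host-mosquito model within each city, combined with a migration step that moves a fraction of animals between cities; all rates are invented placeholders.

```python
# Per-city infection update plus a random migration step between cities.
import numpy as np

rng = np.random.default_rng(0)
n_cities = 4
S = np.full(n_cities, 990.0)          # susceptible animals per city
I = np.full(n_cities, 10.0)           # infected animals per city
beta, gamma, migr = 0.0004, 0.1, 0.02

for day in range(60):
    # 1) local transmission in each city (placeholder for the host-mosquito ODE model)
    new_inf = beta * S * I
    S, I = S - new_inf, I + new_inf - gamma * I
    # 2) migration: a fraction of each compartment moves to a randomly chosen city
    for city in range(n_cities):
        dest = rng.integers(0, n_cities)
        moved_S, moved_I = migr * S[city], migr * I[city]
        S[city] -= moved_S; I[city] -= moved_I
        S[dest] += moved_S; I[dest] += moved_I

print(np.round(I, 1))                  # infected animals per city after 60 days
```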
Deep Learning Algorithms for Diagnosis of Breast Cancer with Maximum Likelihood Estimation

Machine Learning (ML), and particularly Deep Learning (DL), continues to advance rapidly, attracting the attention of the health imaging community to apply these techniques to increase the precision of cancer screening. The most common cancer in women is breast cancer, which affects more than 2 million women each year and causes the largest number of cancer deaths in the female population. This work provides state-of-the-art research on the contributions and new applications of DL for the early diagnosis of breast cancer. It also emphasizes how and which major applications of DL algorithms will benefit the early diagnosis of breast cancer, for which CNNs, one of the DL architectures, are used. In this study, a DL method for diagnostic and prognostic analysis using an X-ray breast image dataset for breast cancer is studied. Based on the dataset, the aim is to diagnose breast cancer at an early stage, which may precede a clinical diagnosis. For testing the probability of the disease, 21400 X-ray breast images, both normal and cancerous, were taken from USF mammography datasets. Of these images, 70% are used for the training step, while 30% are used for the testing step. After the implementation of the architecture, VGG16 achieved an overall accuracy of 96.77%, with 97.04% sensitivity and 96.74% AUC, while Inception-v4 achieved an overall accuracy of 96.67%, with 96.03% sensitivity and 99.88% AUC. These results show the high value of using DL for the early diagnosis of breast cancer. The results are promising.

Mehmet Akif Cifci, Zafer Aslan
Aeroacoustics Investigation of an Uncontrolled and a Controlled Backward-Facing Step with Large Eddy Simulation

In this study, 3D flow over a backward-facing step is solved for a low Mach number. Large Eddy Simulation is used to resolve turbulent features in the flow field. Numerical results for the turbulent flow field are compared with experimental velocity profiles at different stations and are validated. In addition to computing the flow field, sound levels at receivers placed at different locations due to the backward-facing step flow are determined using the Acoustic Analogy (AA) first proposed by Lighthill. Unsteady flow field variables obtained from a commercial computational fluid dynamics code are used as input to the Ffowcs Williams-Hawkings (FW-H) equation, an extended version of the Lighthill equation. Additionally, some active flow control methodologies are applied to the flow field to better understand the relations between flow control and the acoustic results. These flow control techniques are suction and blowing with different magnitudes at the bottom of the backward-facing step. The controlled cases are also compared with experimental flow field results and validated. Acoustic results are plotted in the frequency and time domains. The effects of the different active controls on the acoustic results are evaluated and interpreted.

Kamil Furkan Taner, Furkan Cosgun, Baha Zafer

International Workshop on Affective Computing and Emotion Recognition (ACER-EMORE 2020)

Frontmatter
Quantifying the Links Between Personality Sub-traits and the Basic Emotions

This article describes an exploratory study that aimed to analyse the relationship between personality traits and emotions. In particular, it investigates to what extent the sub-traits of the Five Factor Model have an empirically quantifiable correlation with the Basic Emotions (Anger, Anxiety, Disgust, Fear, Joy, Sadness, Surprise). If links between these personality traits and the basic emotions can be found, this would enable an emotional-state-to-personality-trait mapping. In this study, 38 participants answered a Big Five Aspects Scale (BFAS) questionnaire and then watched 12 emotionally provocative film clips, answering 12 short emotional Likert scales on their emotional experiences during each clip. The results showed that (i) four of the seven Basic Emotions correlated significantly with at least one of the personality traits, while two emotions (Fear and Disgust) approached statistical significance, and (ii) significant correlations between personality traits and basic emotions could only be identified at the sub-trait level, demonstrating the value of adopting a higher-resolution personality model. The results support the long-term goal of this research, which is the enabling of state-to-trait inferences. A method for analysing and visualising such a mapping, which differentiates mappings based on the direction and magnitude of the effect size, was also developed. The study contributes a blueprint towards utilising Affective Computing methodology to automatically map these phenomena.

Ryan Donovan, Aoife Johnson, Aine deRoiste, Ruairi O’Reilly
Method of Sentiment Preservation in the Kazakh-Turkish Machine Translation

This paper describes the characteristics which affect sentiment analysis in Kazakh-language texts, models of morphological rules and morphological analysis algorithms, formal models of simple sentence structures for the Kazakh-Turkish pair, and models and methods of sentiment analysis of texts in the Kazakh language. The studies carried out to compare the morphological and syntactic rules of the Kazakh and Turkish languages prove their structural closeness. In this respect, we can assume that taking sentiment into account in machine translation for this language pair will give good results in preserving the meaning of the text.

Lena Zhetkenbay, Gulmira Bekmanova, Banu Yergesh, Altynbek Sharipbay
Deep Convolutional and Recurrent Neural Networks for Emotion Recognition from Human Behaviors

Human behaviors and the emotional states that they convey have been studied by psychologists and sociologists. The tracking of behaviors and emotions is becoming more pervasive with the advent of the Internet of Things (IoT), where small, always-connected sensors can continuously capture information about human gestures, movements and postures. The captured readable behaviors convey significant information that can be represented as time series. Few studies in emotion recognition and affective computing have explored the connection between time-series sensor data and the emotional behavior it conveys. In this paper, an innovative approach is proposed to study the emotions and behaviors connected to time-series data. A convolutional network augmented with an attention-based bidirectional LSTM is introduced to represent the correlations between behaviors and emotions. The advantage of this model is that it can recognize emotions well by exploiting the data captured by sensors. The experimental results show that the proposed deep learning method outperforms separate schemes and achieves a high degree of accuracy in modelling human behaviors and emotions.

James J. Deng, Clement H. C. Leung
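A PyTorch sketch of the architecture family named above (1D convolutions, a bidirectional LSTM, and additive attention pooling before the classifier); dimensions, channel counts and the attention form are illustrative assumptions rather than the authors' configuration.

```python
# Conv1d feature extraction -> BiLSTM -> attention pooling over time -> emotion logits.
import torch
import torch.nn as nn

class ConvAttnBiLSTM(nn.Module):
    def __init__(self, in_channels=6, n_emotions=4, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.bilstm = nn.LSTM(32, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, n_emotions)

    def forward(self, x):                         # x: (batch, channels, time)
        h = self.conv(x).transpose(1, 2)          # -> (batch, time, 32)
        h, _ = self.bilstm(h)                     # -> (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)    # attention weights over time
        context = (w * h).sum(dim=1)              # weighted temporal pooling
        return self.out(context)

model = ConvAttnBiLSTM()
print(model(torch.randn(8, 6, 128)).shape)        # torch.Size([8, 4])
```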
Exploring Negative Emotions to Preserve Social Distance in a Pandemic Emergency

In this work, we present a multi-agent robotic system that explores the use of unpleasant emotions, triggered by the visual, sound and behavioural affordances of autonomous agents, to interact with humans and preserve social distance in public spaces in the context of a pandemic emergency. The idea was born in the context of the Covid-19 pandemic, where discomfort and fear have been widely used by governments to preserve social distancing. This work does not implicitly endorse the use of fear to keep order, but explores controlled and moderate automated exploitations; moreover, it deeply analyses the pros and cons of the ethical use of robots with emotion recognition and triggering capabilities. The system employs a swarm of all-terrain hexapods patrolling a public open space and generally having a discreet and seamless presence. The goal is to preserve social distance among the public with effective but minimal intervention, limited to anomaly detection. The single agents implement critical tasks: context detection strategies and triggering negative emotions at different degrees of arousal, using affordances ranging from appearance and simple proximity or movements to disturbing sounds or explicit voice messages. The whole system exhibits an emergent swarm behaviour in which the agents cooperate and coordinate in a distributed way, adapting and reacting to the context. An innovative contribution of this work, beyond the application itself, is the use of unpleasant-emotion affordances in an ethical way, to attract user attention and induce the desired behaviour in the emergency. This work also introduces a method for assessing the emotional level of individuals and groups of people in the context of swarm agents. The system extends the experience of the gAItano hexapod project, an autonomous agent with image detection and planned object relocation capabilities.

Valentina Franzoni, Giulio Biondi, Alfredo Milani

International Workshop on AI Factory and Smart Manufacturing (AIFACTORY 2020)

Frontmatter
A CPS-Based IIoT Architecture Using Level Diagnostics Model for Smart Factory

In this paper, a construction process using a level-diagnostics agent was applied to the construction of a smart factory. The current smart factory status of the demanding company was measured, the target level was derived, and a CPS-based design of the smart factory construction type was proposed. We suggest that building a CPS-simulation-based smart factory is more effective in preparing for cloud-based smart factory manufacturing, given the explosive increase of data during the informatization, automation, and intelligence stages of smart factory development. The paper presents a Korean-type smart factory through an empirical research method that draws on actual construction cases of smart factory level diagnosis, organized according to the basic components of the smart factory (information, automation, and intelligence), and presents examples for each smart factory level as empirical cases.

Byungjun Park, Jongpil Jeong
A Hydraulic Condition Monitoring System Based on Convolutional BiLSTM Model

In this paper, to monitor the condition of a hydraulic system, a real-time monitoring method based on the combination of a convolutional neural network (CNN) and a bidirectional long short-term memory network (BiLSTM) is proposed. The CNN extracts features from the time-series data given as input, and the BiLSTM learns temporal information from these features. The learned representation is then passed to a sigmoid classifier, which determines whether the system is stable or unstable. The experimental results show that, compared to other deep learning models, this model can more accurately predict the condition of the hydraulic system from the data collected by the sensors.

Kyutae Kim, Jongpil Jeong
Bearing Fault Detection with a Deep Light Weight CNN

Bearings are a vital part of rotating machines. A bearing failure negatively affects schedules and production operations and can even lead to human casualties; according to the literature, more than half of machine breakdowns are caused by bearing faults. Early fault detection and diagnosis (FDD) of bearings is therefore essential for the safe and reliable operation of rotating machinery systems. A key challenge in industrial FDD is reducing the time delay of detection; however, models with many learnable parameters and long input sequences both increase this delay. This paper therefore proposes a deep Light Convolutional Neural Network (LCNN) based on one-dimensional convolutions for FDD.
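
As an illustration of how the parameter count of a 1D CNN for vibration signals can be kept low, the sketch below uses depthwise-separable convolutions; this is an assumed design choice for the example, not the authors' LCNN architecture.

```python
# Illustrative sketch of a lightweight 1D CNN for vibration-based fault detection.
# Depthwise-separable convolutions are an assumption about what "light" could mean.
import torch
import torch.nn as nn

def sep_conv1d(cin, cout, k=9):
    # depthwise + pointwise convolution: far fewer weights than a full Conv1d
    return nn.Sequential(
        nn.Conv1d(cin, cin, k, padding=k // 2, groups=cin),
        nn.Conv1d(cin, cout, 1),
        nn.BatchNorm1d(cout),
        nn.ReLU(),
        nn.MaxPool1d(4),
    )

lcnn = nn.Sequential(
    sep_conv1d(1, 16),          # raw vibration signal, single channel
    sep_conv1d(16, 32),
    sep_conv1d(32, 64),
    nn.AdaptiveAvgPool1d(1),    # global pooling keeps the classifier head tiny
    nn.Flatten(),
    nn.Linear(64, 4),           # e.g. normal / inner race / outer race / ball fault
)

scores = lcnn(torch.randn(8, 1, 2048))   # 8 windows of 2048 vibration samples
print(sum(p.numel() for p in lcnn.parameters()), "learnable parameters")
```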

Jin Woo Oh, Jongpil Jeong
Air Pollution Measurement Platform Based on LoRa and Blockchain for Industrial IoT Applications

Air pollution poses risks such as global warming and ecosystem changes. Contaminants such as harmful gases, particulate matter and residual materials generated in factories are among the main causes of air pollution; strict control of the contaminants produced in factories is therefore needed. In this paper, we propose a new platform that uses the Internet of Things (IoT) and blockchain to monitor air quality without spatial restrictions. The proposed platform collects data in real time through LoRa (Long Range) based IoT sensor devices placed at the sources of contaminants in the factory and transmits the encrypted data to the cloud through blockchain transactions. Using this platform, air pollution generated in factories can be managed in real time and data integrity can be preserved. Through this study, it is possible to measure and monitor airborne contaminants and use them as important data for improving and overcoming environmental problems caused by air pollution.

Yohan Han, Jongpil Jeong
Real-Time Inspection of Multi-sided Surface Defects Based on PANet Model

Product quality is the most important factor in manufacturing. Machine vision is a technique that performs, in the industrial field, cognitive judgments usually made by humans, or tasks that are generally difficult for humans. Traditional inspection by human eyes suffers from many difficulties because of the repetitive nature of the task, and artificial-intelligence-based machine vision has recently been studied to address these problems. Using a vision inspection system, information such as the number of products, defect detections, and defect types can be collected without human intervention, which maximizes a company's operational efficiency through productivity improvement, quality improvement, and cost reduction. Most vision inspection systems currently in use are single-sided: they collect and inspect one image per product. In the actual manufacturing industry, however, the products for which single-sided image inspection is sufficient are limited to a few product groups, and most require multi-sided image inspection. In addition, an inspection system used in the field must detect product defects while meeting the production speed required at the manufacturing site. In this paper, we propose a deep neural network based vision inspection system that satisfies both multi-sided image inspection and fast production speeds. With seven cameras and appropriate optics, multi-sided images of the product are collected simultaneously, and defects can be detected quickly in real time using a PANet (Path Aggregation Network) model. The proposed system makes it possible to inspect product defects at the level required at the manufacturing site, and the information obtained during inspection provides very important data for evaluating and improving product quality.

Yohan Han, Jongpil Jeong
Estimation of Greenhouse Gas Emissions in Cement Manufacturing Process Through Blockchain and SSL Based IoT Data Analysis

Recently, Internet of Things (IoT) systems, which support human activities based on various types of real data, have attracted attention in many fields. However, the behavior of such a system is determined by the actual data, so if an attacker alters the data, a critical problem can occur in the system. To ensure the reliability of real data, many researchers have therefore studied data management platforms that use security management technology. In the proposed platform, the pre-manufacturing phase of the cement manufacturing process ensures the reliability of the generated data by recording it in the distributed ledger of a blockchain. In the manufacturing phase, Secure Socket Layer (SSL) connections based on digital certificates are used to establish encrypted connections between the browser or user's computer and the server or website; these connections protect the sensitive data exchanged in each session from interference by unauthorized users. We design a management platform that can verify the integrity of the stored data through the proposed SSL and blockchain scheme. The data collected in real time from the system guarantee the integrity of the carbon-emission calculations, offering the opportunity to drastically reduce the manpower and time currently required for the first and second stages of verification.

Byeongseop Kim, Myungsoo Kim, Jongpil Jeong
Non-intrusive Load Monitoring Based on Regularized ResNet with Multivariate Control Chart

With the development of industry and the spread of the smart home, the need for power monitoring technologies for effective energy management systems is increasing. Among these, non-intrusive load monitoring (NILM) is an efficient way to solve the electricity consumption monitoring problem. NILM is a technique that measures the power consumption of individual devices by analyzing the power data collected through smart meters and commercial devices. In this paper, we propose a deep neural network (DNN) based NILM technique that enables energy disaggregation and power consumption monitoring simultaneously. Energy disaggregation is performed by training a deep residual network for multilabel regression. Real-time monitoring is performed using a multivariate control chart on latent variables extracted through the weights of the trained model. The energy disaggregation and monitoring performance of the proposed method is verified on the public NILM Electricity Consumption and Occupancy (ECO) data set.
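
A minimal sketch of the monitoring step is shown below, assuming a Hotelling T² chart as the multivariate control chart and treating the latent-feature extraction as given; the variable names and toy data are illustrative only.

```python
# Fit a Hotelling T^2 control chart on latent variables from normal operating data,
# then flag new samples whose T^2 exceeds the control limit.
import numpy as np
from scipy import stats

def fit_t2_chart(Z, alpha=0.01):
    """Z: (n_samples, n_latent) latent features extracted under normal operation."""
    mu = Z.mean(axis=0)
    S_inv = np.linalg.pinv(np.cov(Z, rowvar=False))
    n, p = Z.shape
    # F-distribution based upper control limit for future observations
    ucl = p * (n - 1) * (n + 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)
    return mu, S_inv, ucl

def t2_statistic(z, mu, S_inv):
    d = z - mu
    return float(d @ S_inv @ d)

# toy usage with random "latent" features
rng = np.random.default_rng(0)
Z_train = rng.normal(size=(500, 8))
mu, S_inv, ucl = fit_t2_chart(Z_train)
z_new = rng.normal(size=8) + 3.0             # shifted sample, should raise an alarm
print(t2_statistic(z_new, mu, S_inv) > ucl)  # True -> abnormal consumption pattern
```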

Cheolhwan Oh, Jongpil Jeong

International Workshop on Air Quality Monitoring and Citizen Science for Smart Urban Management. State of the Art and Perspectives (AirQ&CScience 2020)

Frontmatter
Estimating Air Pollution Related Solar Insolation Reduction in the Assessment of the Commercial and Industrial Rooftop Solar PV Potential

Air pollution is a serious issue that has become increasingly urgent over recent years, mainly because of its effects on human health. Recently, however, a number of scientific papers have appeared reporting on air pollution effects in other fields, such as photovoltaic energy generation, and notably on the relation between PM2.5 concentrations in the air and the reduction of the solar insolation reaching PV systems. In this study, the rooftop solar PV potential of commercial and industrial (C&I) buildings at regional scale has been estimated taking into account the spatially distributed solar insolation reduction factor due to PM2.5 in the air. High-resolution LiDAR data and advanced digital surface modelling techniques have been used to determine the available suitable rooftop area and estimate the technical solar PV potential of the C&I rooftops. For the C&I study area of Aversa Nord (Southern Italy), we find that the suitable rooftops have a total annual electric power potential of 50.75 GWh/year. For this area, an annual average PM2.5 concentration of about 13 μg/m³ results in a nearly 5% annual solar insolation reduction. Thus, if properly located, large-scale rooftop PV systems could significantly decrease primary energy consumption and contribute to reducing CO2 emissions.
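
As a rough, purely illustrative calculation (assuming yield scales linearly with insolation, and without knowing whether the quoted 50.75 GWh/year is gross or already net of the reduction), a 5% insolation reduction on a potential of that order corresponds to a few GWh per year:

```python
# Back-of-the-envelope illustration of the figures quoted above; purely indicative.
annual_potential_gwh = 50.75        # technical PV potential of suitable rooftops
insolation_reduction = 0.05         # ~5% loss at ~13 ug/m3 average PM2.5

lost_gwh = annual_potential_gwh * insolation_reduction
net_gwh = annual_potential_gwh - lost_gwh
print(f"PM2.5-related loss ~ {lost_gwh:.2f} GWh/year, net ~ {net_gwh:.2f} GWh/year")
# PM2.5-related loss ~ 2.54 GWh/year, net ~ 48.21 GWh/year
```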

G. Fattoruso, M. Nocerino, G. Sorrentino, V. Manna, M. Fabbricino, G. Di Francia

International Workshop on Automatic Landform Classification: Spatial Methods and Applications (ALCSMA 2020)

Frontmatter
Computer-Aided Geomorphic Seabed Classification and Habitat Mapping at Punta Licosa MPA, Southern Italy

Accurate seafloor maps are a critical component for understanding marine ecosystems and are essential for marine spatial planning, management of submerged cultural heritage and hazard risk assessments. In September 2001 the Marine Protected Area (MPA) of Punta Licosa was mapped using a multibeam echosounder (MBES) and a side scan sonar (SSS) system in support of the Geosed project. These seabed investigations allowed high-resolution bathymetric measurements and acoustic seafloor characterization through backscatter imagery. Based on visual interpretation of the data, the present study used a computer-aided seabed classification approach to map the marine landform features and seabed composition of the study area. The results were then translated into a complete-coverage geomorphologic map of the area to define benthic habitats. The offshore shelf plain makes up more than half of the region (52.2%), with terraces making up another 10.2% of the total area. Slopes make up a cumulative 30.1% of the study area, scarp features comprise 4.3%, and ridge features reach only 3.2% of the total study area. The benefits of the computer-aided seabed classification approach used in this study consisted of a fairly accurate geomorphic classification, while the effectiveness of a semi-automated approach for identifying substrate composition from backscatter data mostly depended on the level of acoustic artefacts present within the survey area.

Crescenzo Violante
Comparison of Different Methods of Automated Landform Classification at the Drainage Basin Scale: Examples from the Southern Italy

In this work, we tested the reliability of two different methods of automated landform classification (ACL) in three geological domains of the southern Italian chain with contrasting morphological features. ACL maps derived from the TPI-based (topographic position index) algorithm are strictly dependent on the search input parameters and are not able to fully capture landforms of different sizes. The geomorphons-based classification has shown a higher potential and can represent a powerful ACL method, although it should be improved with the introduction of additional DEM-based parameters for the extraction of landform classes.

Dario Gioia, Maria Danese, Mario Bentivenga, Eva Pescatore, Vincenzo Siervo, Salvatore Ivo Giano
Tools for Semi-automated Landform Classification: A Comparison in the Basilicata Region (Southern Italy)

Recent advances in spatial methods of digital elevation model (DEM) analysis have addressed many research topics on the assessment of morphometric parameters of the landscape. The development of computer algorithms for calculating the geomorphometric properties of the Earth's surface has allowed some of these methods to be extended to the semi-automatic recognition and classification of landscape features. Several papers have accordingly documented the applicability of landform classification based on map algebra. The Topographic Position Index (TPI) is one of the most widely used parameters for semi-automated landform classification using GIS software. The aim here was to apply TPI classes for landform classification in the Basilicata Region (Southern Italy), which is characterized by an extremely heterogeneous landscape and geological features. The automated landform extraction, starting from two DEMs of different resolution (20 m and 5 m grids), has been carried out using three different GIS software packages: Arcview, Arcmap, and SAGA. The landform maps produced by each package at the different scales have been compared, ultimately identifying the best landform map and providing a discussion of which software offers the best implementation of the TPI method.
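
A minimal sketch of the TPI computation and a coarse thresholding into landform classes is given below, assuming a square moving window; the window size and thresholds are illustrative, not those used in the study.

```python
# TPI is the difference between a cell's elevation and the mean elevation of its
# neighbourhood; cells well below the mean read as valleys, well above as ridges.
import numpy as np
from scipy.ndimage import uniform_filter

def tpi(dem, size=11):
    """Topographic Position Index with a square moving window of `size` cells."""
    return dem - uniform_filter(dem.astype(float), size=size)

def classify(tpi_vals, flat_tol=1.0):
    """Very coarse 3-class example: valley / flat or mid-slope / ridge."""
    classes = np.full(tpi_vals.shape, 1, dtype=int)     # 1 = flat or mid-slope
    classes[tpi_vals <= -flat_tol] = 0                  # 0 = valley / depression
    classes[tpi_vals >= flat_tol] = 2                   # 2 = ridge / crest
    return classes

dem = np.random.default_rng(1).normal(500, 20, size=(200, 200))  # toy elevation grid
landforms = classify(tpi(dem, size=11))
print(np.bincount(landforms.ravel()))    # cell counts per landform class
```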

Salvatore Ivo Giano, Maria Danese, Dario Gioia, Eva Pescatore, Vincenzo Siervo, Mario Bentivenga
High-Resolution LiDAR-Derived DEMs in Hydrographic Network Extraction and Short-Time Landscape Changes

In this paper an automatic methodology to extract the channel network from high-resolution LiDAR-derived DTMs and a semi-quantitative methodology to assess the short-time landscape evolution of a test area located in southern Italy have been applied. In particular, the technique used is based on a local nonlinear filter together with global geodesic optimization for channel head and drainage network extraction. Further, the two LiDAR acquisitions for the years 2012 and 2013 have been used to detect hydrographic network changes and slope evolution in terms of erosion and deposition patterns, which are then compared with the slope processes (landslides and linear erosion).

Maurizio Lazzari

International Workshop on Advances of Modelling Micromobility in Urban Spaces (AMMUS 2020)

Frontmatter
Evaluation of Pedestrians’ Behavior and Walking Infrastructure Based on Simulation

Sustainable mobility mainly refers to cycling, walking, e-scooters, public transport etc. Increasing the use of these means of transport instead of motorised vehicles can improve the overall quality of the urban environment. Pedestrian streets play a significant role in promoting sustainable mobility and facilitating environmentally friendly daily trips. This research concerns one of the most important pedestrian streets in the centre of the city of Thessaloniki, Greece. The existing situation in the pedestrian street is evaluated, together with various scenarios concerning changes in the behaviour and direction of movement of pedestrians due to incidents. The evaluation was carried out using the pedestrian simulation software PTV Viswalk, and four different scenarios were tested. The first scenario is the base scenario, referring to the existing situation. The second scenario deals with an increase in pedestrian flow due to unexpected events. The other two scenarios refer to evacuation situations that prevent access to part of the pedestrian street. The examination concerns the pedestrian Level of Service (LOS) and the identification of critical segments of the pedestrian street in terms of pedestrian flow and composition. The simulation results show that even a doubling of pedestrian flow would not cause an overall significant drop in LOS, except for specific sections of the pedestrian street. These results can assist the authorities in minimizing walking difficulties in the pedestrian street and, at the same time, in providing waiting areas and optimum evacuation routes.

Tiziana Campisi, Socrates Basbas, Giovanni Tesoriere, Antonino Canale, Panagiotis Vaitsis, Dimitris Zeglis, Charilaos Andronis
The Evaluation of Home-School Itineraries to Improve Accessibility of a University Campus Through Sustainable Transport Modes

Sustainable mobility is often related to the balance between transport supply and demand, including, in its development, the connection between behavioural and economic factors. Furthermore, investigating the travel purpose is useful for this assessment, also in correlation with different age groups and gender. There is a growing need for young people to access university campuses, but the transport supply is often not adapted to student needs. This problem compromises not only accessibility but also sustainability, and therefore the environmental, economic and social dimensions. It thus emerges that the growing transport services need to be adapted to students' economic availability, travel distances and waiting times. This work explored the accessibility of a university campus (school node) through interviews considering the current transport offer and the home-school travel purpose. The analysis was also linked to the availability of parking lots adjacent to the campus; the occupancy rates of the various neighbouring car parks were calculated by monitoring these areas for about a month using video cameras and sensors. The research investigated travel distances, in both Euclidean and network terms, and travel times, through the use of a micro-simulation tool and a linear equation. As a first research step, the study shows that the shared mobility solution saves time, but it also highlights critical issues of the service, which should be better adapted to the students' needs in terms of rates, type of car and subscription.

Antonino Canale, Tiziana Campisi, Giovanni Tesoriere, Luigi Sanfilippo, Alberto Brignone
Exploring the TTMS's Impact on the Accessibility of a Long Distance Stretch Using Micro-simulation Approach

Road maintenance generally involves complex types of work which must respect two important aspects: the first relates to the safety of workers, road users and anyone who comes into contact with the work area; the second relates to reducing the impacts of the narrowing of the carriageway, which creates criticalities for the transiting vehicle flow. The evaluated parameters are the type of road and the position of the site, together with its duration, visibility, speed and type of traffic. This work focuses on the evaluation of two different construction site layouts related to the TTMS (Temporary Traffic Management Scheme) which are periodically implemented on a medium-to-high traffic section of a small mountain town. The monitored area is located on one of the main connecting arteries of the city, adjacent to the main areas of attraction linked to the nearby commercial, residential and university areas. The study was carried out using a micro-simulation tool. The results demonstrate how a different extension of the section under maintenance can drastically reduce the level of service (LOS) for the same vehicle flow. This approach is useful for road managers and local authorities to analyze the impacts produced by the construction site in terms of increased congestion of the vehicle flow, evaluating how these change as the dimensions of the construction site vary. This preventive assessment aims to identify the best solution to implement so as not to interfere with other activities, as this could further reduce the accessibility of the adjoining places.

Tiziana Campisi, Giovanni Tesoriere, Luigi Sanfilippo, Alberto Brignone, Antonino Canale
Understanding Willingness to Use Dockless Bike Sharing Systems Through Tree and Forest Analytics

In this paper we explore factors that affect Bike Sharing System (BSS) usage and how they differ between discrete groups of potential users. BSS have experienced rapid growth in recent years, through technological advances, re-evaluated business models and a reinvention of the mode's utility. Yet, for dockless BSS to achieve real use and a successful integration in the urban mobility ecosystem, the factors that promote willingness to use them need to be explored. Using a sample of 500 stated preference observations, classification trees and random forest models are built for three groups of potential BSS users: car users, bus users and pedestrians. Among the factors considered are BSS cost gains, BSS In-Vehicle Time (IVT) and Out-of-Vehicle Time (OVT) gains, trip frequency, purpose and duration. More specifically, BSS potential was found to increase for car users on short trips of up to 21 min. Bus users and pedestrians were found to be more likely to choose a BSS option at a higher cost, up to 0.60 and 0.75 euros respectively. On the other hand, sociodemographic characteristics such as household income, gender, education level and occupation were not found to be dominant factors in the mode choice decision. OVT was found to be relatively important only for bus users, while cost gains are comparatively more significant for bus users and pedestrians.
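
A sketch of how such a random forest analysis could be set up is shown below; the column names, coding and synthetic data are assumptions for illustration, not the actual survey variables.

```python
# Fit a random forest on (synthetic) stated-preference data and inspect which
# factors drive willingness to use a dockless BSS.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "cost_gain_eur": rng.uniform(0, 1.0, n),       # BSS cost advantage per trip
    "ivt_gain_min": rng.uniform(-10, 10, n),       # in-vehicle time gain
    "ovt_gain_min": rng.uniform(-10, 10, n),       # out-of-vehicle time gain
    "trip_duration_min": rng.uniform(5, 60, n),
    "current_mode": rng.integers(0, 3, n),         # 0 car, 1 bus, 2 pedestrian
})
# toy target: willing to use a dockless BSS for short, cheap-enough trips
y = ((df["trip_duration_min"] < 21) & (df["cost_gain_eur"] > 0.3)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(df, y, test_size=0.3, random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", forest.score(X_te, y_te))
print(dict(zip(df.columns, forest.feature_importances_.round(3))))
```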

Ioannis Politis, Ioannis Fyrogenis, Efthymis Papadopoulos, Anastasia Nikolaidou, Eleni Verani
An Ordered Logit Model for Predicting the Willingness of Renting Micro Mobility in Urban Shared Streets: A Case Study in Palermo, Italy

Sustainable transport modes, particularly micro-mobility, help reduce possible congestion phenomena in urban traffic. This study aims to contribute to increasing micro-mobility use by exploring the impacts of socio-demographics, vehicle ownership (car, bicycle and micro-mobility), level of infrastructure service and road users' perceptions of safety, comfort and a chaotic environment on the willingness to rent micro-mobility in a shared urban street. The study area is Via Maqueda, a street in the historical centre of Palermo, Italy, which is rich in commercial and cultural activities. A survey of 200 individuals was carried out to collect the data for the study. The analysis starts with descriptive statistics that illustrate the characteristics of the predictor variables. This is followed by a relaxed p-value method for selecting the statistically significant predictor variables at the 90% confidence level. The selected predictor variables are then entered into an ordinal logit model. The results suggest that a one-unit increase in car ownership decreases the log-odds of willingness to rent micro-mobility by 0.74 (coefficient −0.74), given that all the other predictors are held constant. A one-unit increase in age group likewise decreases the willingness to rent micro-mobility in shared urban streets. The outcomes will guide decision makers in understanding who the average road users are and what they need in terms of further development of the micro-mobility system in urban shared streets. The originality of this paper lies in considering road users' perceptions of micro-mobility, such as safety and comfort, which can encourage the use of this sustainable urban travel mode in restricted traffic areas.
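
For interpretation, the reported coefficient can be turned into an odds ratio; the short calculation below uses only the −0.74 figure quoted above and assumes the usual proportional-odds parameterization of the ordered logit model.

```python
# In an ordered logit (proportional-odds) model, a coefficient of -0.74 for car
# ownership means each additional car multiplies the odds of being in a higher
# willingness-to-rent category by exp(-0.74), other predictors held fixed.
import math

beta_car = -0.74
odds_ratio = math.exp(beta_car)
print(f"odds ratio = {odds_ratio:.2f}")          # ~0.48
print(f"i.e. about a {(1 - odds_ratio) * 100:.0f}% reduction in the odds per car")
```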

Tiziana Campisi, Nurten Akgün, Giovanni Tesoriere
Quantifying the Negative Impact of Interactions Between Users of Pedestrians-Cyclists Shared Use Space

In recent years, many efforts have been made to promote active modes of transport. For this reason, as well as because of the limited public space available, the co-existence of pedestrians and cyclists is a very common phenomenon that requires extensive investigation. The design and implementation of pedestrian-cyclist shared use space is a widely used technical choice when the road infrastructure is unsuitable for hosting cyclists and the separation of cyclists from motorized traffic is therefore considered advisable. However, the co-existence of pedestrians and cyclists is not always harmonious, and the interactions between them can have a negative impact on their perceived comfort and safety. The present research aims to quantify this negative impact of the various kinds of interactions by considering users' attitudes and applying the Analytic Hierarchy Process (AHP) method. The users' attitudes were captured through questionnaire surveys directed at both pedestrians and cyclists in the city of Palermo, Italy. The results of the analysis are compared with the results of a previous attempt to quantify the impact of interactions; from this comparison, useful conclusions and notes for further research are derived.
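
A compact sketch of the AHP weighting step is given below, assuming a Saaty-scale pairwise comparison matrix; the matrix values and the three interaction types are made up for the example, not the values elicited from the Palermo surveys.

```python
# Derive AHP priority weights (relative impact of interaction types) from a
# pairwise comparison matrix via its principal eigenvector, then check consistency.
import numpy as np

# Hypothetical Saaty-scale comparisons between three interaction types,
# e.g. overtaking, crossing paths, and moving against the flow.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                      # priority vector (relative impact)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
cr = ci / 0.58                             # random index RI = 0.58 for n = 3
print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```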

Andreas Nikiforiadis, Socrates Basbas, Tiziana Campisi, Giovanni Tesoriere, Marina Iliana Garyfalou, Iasonas Meintanis, Thomas Papas, Mirto Trouva
Shared Mobility and Last-Mile Connectivity to Metro Public Transport: Survey Design Aspects for Determining Willingness for Intermodal Ridesharing in Athens

People living in peri-urban low-density areas may not choose urban rail because they are hindered by the 'first/last mile' problem. The issue concerns poor bus feeder service to rail stations and/or congested Park&Ride facilities at the respective intermodal hubs. Shared mobility in the form of carpooling is a viable alternative in connection with urban rail, especially when appropriate incentives and ride-matching tools are put in place. A multi-modal ride-matching app combining flexible (carpooling) and scheduled (rail and bus public transport) mobility is envisaged by the Horizon 2020 Ride2Rail project. Two intermodal hubs of urban rail along the 20 km-long corridor connecting the Athens basin with the Athens airport in Eastern Attica, Greece, are selected as a case study. The paper addresses the behavioural underpinning of a combined rail-rideshare travel companion platform in the first/last-mile context through the design of a stated preference (SP) experiment on mode choice. All the main access and egress modes of the intermodal hubs are considered in the mode choice experiment, namely driving alone, using the bus feeder, carpool driving and carpool riding. The tested parameters pertain in particular to incentive mechanisms that increase ridesharing to intermodal hubs, contextual preferences related to the trip purpose, and perceived barriers to shared mobility.

Alexandros Deloukas, Georgios Georgiadis, Panagiotis Papaioannou
The Benefit of Engaging the “Crowd”: Encouraging a Bottom-up Approach for Shared Mobility Rating

Transport systems today have become multimodal, and the evolution of shared mobility has made it possible to define forms of shared and green mobility such as e-bikes or PMVs (Personal Mobility Vehicles), while car or van sharing reduces the use of private vehicles, often providing the possibility of renting electric vehicles with low environmental impact. When designing a shared mobility service, it is useful to have a global vision that takes into account not only the business and future scenarios of the infrastructure, but also data on the needs of the users who will have to use it. Democratic and participatory planning makes it possible to directly involve the crowd and encourage a bottom-up approach to mobility, providing the basis for different planning and design assessments. This work focuses on a before-and-after statistical evaluation; the results have been elaborated through a longitudinal approach, comparing users' judgements before and after (2018–2020) the advent of car sharing in the city examined. The study highlights the weaknesses of the present car-sharing service implemented in a small town in Sicily (Italy). The results highlight the lack of a preventive analysis of transport demand; this criticality lays the basis for future research aimed at improving the supply of shared mobility by involving the majority of the population in the planning of the service and allowing the combined implementation of car and bike sharing services.

Giovanni Tesoriere, Tiziana Campisi
i-CHANGE: A Platform for Managing Dockless Bike Sharing Systems

The new generation of bike-sharing services without docking stations is spreading in large cities around the world. The paper provides a technical specification of a platform for managing a dockless bike sharing system. The bicycles of the platform are equipped with GPS devices and GPRS cards that can transmit their exact location over the Internet at any time. We collect and store all events derived from a user's interaction with the system and, in addition, the trajectory points of a route during a rental. The platform aims to fulfil the requirements of bikers, administrators and the research community through the collection, analysis and exploitation of bike sharing data. In the context of the platform, an app for smart devices is implemented for citizens to access the system. A dashboard is offered to the administrator as a valuable tool to inspect and promote the system and evaluate its usage. Finally, all stored anonymised data are accessible for further analysis by the research community through a REST API. The i-CHANGE platform is currently being pilot tested in the city of Thessaloniki, Greece.

Lazaros Apostolidis, Symeon Papadopoulos, Maria Liatsikou, Ioannis Fyrogenis, Efthymis Papadopoulos, George Keikoglou, Konstantinos Alexiou, Nasos Chondros, Ioannis Kompatsiaris, Ioannis Politis
Bivariate Analysis of the Influencing Factors of the Upcoming Personal Mobility Vehicles (PMVs) in Palermo

The micro-mobility sector is spreading in the Italian and European urban context. Micro-mobility vehicles are often used to reach areas with particular transit restrictions or to avoid the problems of parking and congestion on the roads. Although personal mobility vehicles (PMVs) rely on rapidly advancing technology, they still have problems related to driving safety in shared road spaces, due not only to inadequate infrastructure but also to some regulatory deficiencies and to user behaviour. The Italian legislation is very recent; it regulates the operational characteristics of the various vehicles and limits their use to certain age groups and to some areas of the cities. The present work focuses on the analysis of the attitudes and perceptions of a sample of users of micro-mobility in the centre of Palermo, one of the metropolises of Southern Italy. The results were obtained by administering questionnaires to a sample of specific users, and the data were studied through a bivariate statistical analysis that highlights the significance of the comparison between two variables. The sample was chosen in collaboration with a citizens' association that promotes group activities using micro-mobility vehicles in Palermo. Several correlations between the variables were addressed; among these, some socio-economic variables were related to the propensity to rent and to the perception of safety during the use of PMVs in Palermo. From this comparison, conclusions and notes useful for further research steps emerge.

Tiziana Campisi, Kh Md Nahiduzzaman, Dario Ticali, Giovanni Tesoriere
Understanding the Key Factors of Shared Mobility Services: Palermo as a Case Study

The potential success of shared mobility services in urban areas strongly depends on careful tariff planning, adequate sizing of the fleet and an efficient, integrated public transport system, as well as on the application of policies in favour of sustainable modes of transport. The balance between earnings and expenses is not always an easy target for companies in those cities where these services are not well rooted in citizens' mobility habits; often only large operators in the sector can continue to offer a service generating profit. However, several factors can determine the success or failure of shared mobility services. The objective of this study is to identify, with the help of a case study, success and failure factors, developing an approach that supports companies in managing the services and optimizing fares and fleet so as to increase the number of members and maximize profits. The city of Palermo has been chosen as a case study: the “Amigo” carsharing service, partly station-based and partly free-floating, is managed by the municipal company AMAT S.p.A., which also operates the public transport service.

Alessandro Emilio Capodici, Gabriele D’Orso, Marco Migliore

International Workshop on Advances in Information Systems and Technologies for Emergency Management, Risk Assessment and Mitigation Based on the Resilience Concepts (ASTER 2020)

Frontmatter
Typological Inventory of Residential Reinforced Concrete Buildings for the City of Potenza

The seismic vulnerability assessment of the built heritage located in a specific area represents an important starting point both for evaluating the consequences of significant seismic events and for properly managing the post-seismic reconstruction phase. In other words, the vulnerability assessment represents one of the main inputs for resilience analysis at the urban scale. However, for a large-scale study, a building-specific assessment approach is extremely difficult and time-consuming. From this perspective, the definition of territory-specific structural typologies and corresponding vulnerability classes represents a powerful tool for a rapid estimation of the “global vulnerability” of an examined area. As a matter of fact, classifying the built heritage into a limited number of structural typologies (featuring similar characteristics) can considerably reduce the complexity of the vulnerability assessment, and hence of resilience analysis, at the urban scale. In this paper, an investigation of the built heritage of the city centre of Potenza (southern Italy) is proposed. In particular, the main typological and structural features of the residential Reinforced Concrete (RC) constructions detected in the investigated territory have been identified through an integrated approach involving Census data, documentary analyses, and site and virtual inspections (i.e. GIS-based analysis). The typological-structural characterization represents the first step of a comprehensive study, carried out within the PON-AIM 2014–2020 project, aimed at the evaluation of the seismic resilience of the examined area.

Amedeo Flora, Chiara Iacovino, Donatello Cardone, Marco Vona
Defining a Masonry Building Inventory for the City of Potenza

The seismic vulnerability assessment of the masonry built heritage is an important issue in Italy, due to the high seismic hazard of the territory and the huge number of masonry buildings located in the exposed areas. The evaluation of the seismic performance of buildings can be useful to assess possible damage occurring after an earthquake, the direct costs which would result from it, the “pre” and “post” loss distribution in an urban area and the most vulnerable buildings. At the urban scale, a building-specific assessment approach is extremely difficult and time-consuming because of the large number of constructions. Therefore, being able to classify the built heritage into a limited number of territory-specific structural typologies with similar characteristics would significantly simplify the vulnerability assessment. This paper presents a typological and structural classification of the existing masonry buildings of the historic centre of Potenza (southern Italy), aimed at designing a virtual city consisting of different building categories. The main typological and structural features of the masonry constructions (MUR) have been identified through documentary analyses, Census data, site investigation and GIS-based analysis, by considering the frequency of significant structural parameters. This represents the starting point of a comprehensive research study, carried out within the PON-AIM 2014–2020 project, aimed at the evaluation of the seismic resilience of the investigated area.

Chiara Iacovino, Amedeo Flora, Donatello Cardone, Marco Vona
A Model to Mitigate the Peripheralization Risk at Urban Scale

The uncontrolled expansion of built-up areas and of multiple forms of poverty, at a global level, is determining a new geography of degradation, extending both to suburbs and to central zones, and thus exposing cities in their entirety to the risk of peripheralization. In this framework, counteracting actions at the urban scale, such as regeneration programmes, need to be targeted primarily at areas of significant risk, where vulnerability factors combine across three dimensions: social, building and urban. Furthermore, to be effective, such programmes must be geared towards maximizing risk mitigation. This is possible when the planning of interventions takes into account the evaluation of the best design alternative for reducing pre-existing vulnerability; such an approach constitutes the novelty of the study. The aim of the work is therefore to provide an innovative model for the mitigation of peripheralization risk at the urban scale. For this purpose, the contribution defines a set of mitigation indicators and a protocol for evaluating the most effective design alternative based on the Analytic Hierarchy Process (AHP).

Gerundo Roberto, Nesticò Antonio, Marra Alessandra, Carotenuto Maria
Changing from the Emergency Plan to the Resilience Plan: A Novel Approach to Civil Protection and Urban Planning

Seismic risk studies and the consequent mitigation strategies have become increasingly important. Modern communities are complex; their management therefore involves complex processes spanning strongly different topics: urban planning, socio-economic dynamics, the need to preserve cultural heritage, safety, and natural hazard effects. In recent years, a significant change of perspective has seemed necessary. In particular, seismic risk mitigation (and, more generally, natural risk mitigation) must become one of the main topics in urban planning and in the governance of communities. It should be highlighted that in past earthquakes the largest consequences and losses have generally been caused, in the medium and long term, by the low or lacking resilience of the communities. In recent years, however, the concrete application of the concept of resilience has taken on a key role in seismic risk studies. In particular, the resilience assessment of the housing stock, both qualitative and quantitative, can be considered the core of the problem. In this study, a change in urban planning and emergency management tools based on the concept of resilience is defined. A case study is considered to show a first application: the improvement of the resilience of the investigated town is defined on the basis of several strategies and is then quantified.

Marco Vona
Local Geology and Seismic-Induced Damages: The Case of Amatrice (Central Italy)

On 24th August 2016 the first earthquake (Mw 6.2) of a long-lasting sequence struck Central Italy. The 24th August mainshock occurred in the surroundings of Amatrice, where about 300 people died. Most of the buildings were damaged, and immediately after the earthquake the Italian National Civil Protection (DPC) started coordinating the emergency and post-emergency activities. The latter included geological and geotechnical investigations for seismic microzonation, carried out by the Centre for Seismic Microzonation (CMS), and the creation of a dedicated task force for rubble management in the town of Amatrice. The present study presents preliminary results on the spatial correlation between the distribution of building damage generated by the 24th August earthquake, obtained by means of the Copernicus Emergency Management System (EMS) services, and the results of the seismic microzonation study of the village. We observed a spatial correlation between building damage and seismic ground motion amplification, quantitatively estimated through an amplification factor (FHa). In particular, we observed an increasing proportion of heavily damaged buildings as FHa grows.

Sergio Cappucci, Giacomo Buffarini, Ludovica Giordano, Salomon Hailemikael, Guido Martini, Maurizio Pollino
Optimization of Low-Cost Monitoring Systems for On-Site Earthquake Early-Warning of Critical Infrastructures

In recent years, monitoring systems based on low-cost and miniaturized sensors (MEMS) have proved to be a very successful compromise between the availability of data and their quality. Applications in the field of seismic and structural monitoring have also been constantly increasing in number and in variety of functions. Among these applications, the implementation of systems for earthquake early warning is a cutting-edge topic, mainly because of its relevance for society, as millions of people in various regions of the world are exposed to high seismic hazard. This paper introduces the optimization of an already established seismic (and structural) monitoring system that makes it suitable for earthquake early warning. In particular, the sampling code has been improved and a new triggering algorithm, able to automatically detect the ground shaking due to the propagation of seismic waves, has been developed. The preliminary results indicate that the system is very flexible and easy to implement, and encourage further development.
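
The abstract does not name the triggering algorithm; one classical choice for detecting the onset of shaking on low-cost accelerometers is an STA/LTA ratio, sketched below on a single acceleration channel purely as an assumed illustration.

```python
# Short-term-average / long-term-average (STA/LTA) trigger on squared acceleration:
# a sudden increase of short-term energy relative to the background raises an alarm.
import numpy as np

def sta_lta_trigger(acc, fs, sta_s=0.5, lta_s=10.0, threshold=4.0):
    """Return sample indices where the trailing short-term average energy exceeds
    `threshold` times the trailing long-term average energy."""
    e = acc.astype(float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    c = np.concatenate(([0.0], np.cumsum(e)))          # prefix sums of energy
    idx = np.arange(lta_n, e.size)                     # where both windows fit
    sta = (c[idx + 1] - c[idx + 1 - sta_n]) / sta_n    # trailing short window
    lta = (c[idx + 1] - c[idx + 1 - lta_n]) / lta_n    # trailing long window
    return idx[sta / np.maximum(lta, 1e-12) > threshold]

fs = 100                                               # Hz, typical MEMS sampling rate
t = np.arange(0, 60, 1 / fs)
acc = np.random.default_rng(3).normal(0, 0.01, t.size)           # background noise
acc[3000:3300] += 0.2 * np.sin(2 * np.pi * 5 * t[3000:3300])     # synthetic shaking
print(sta_lta_trigger(acc, fs)[:5])                    # first triggered samples
```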

Antonino D’Alessandro, Salvatore Scudero, Giovanni Vitale, Andrea Di Benedetto, Giosuè Lo Bosco

International Workshop on Advances in Web Based Learning (AWBL 2020)

Frontmatter
Industry 4.0 Briefcase: An Innovative Engineering Outreach Project for Professions of the Future

This paper presents an engineering outreach project titled “Industry 4.0 Briefcase” that has been developed to introduce the concept of Industry 4.0 to undergraduate students. The project aims to make participants aware of Industry 4.0 related topics such as machine learning, data mining, industrial automation, human-machine interfaces, and product life cycles, and to stimulate their curiosity towards these topics. The scope of the project consists of presentations, experimental applications, observations, individual and collaborative studies, assessment and evaluation practices, e-learning applications, and a social program. The project was conducted as a one-week program at a public university in Turkey with the participation of 18 third-year undergraduate students. The participants were students from the computer engineering and software engineering departments of the engineering faculties and the business department of the faculty of economics and administrative sciences; they thus had the opportunity to exchange information with students and faculty members from different academic backgrounds. The study utilized a mixed methods approach, performing both quantitative and qualitative measurements. To collect data, mini projects and project evaluation forms were used for the quantitative measurements, and daily virtual classroom sessions and a general evaluation session (focus group interview) were used for the qualitative measurements. The success rates of the participants, based on the evaluation of the reports they presented, were 91% in data mining, 95% in industrial automation and human-machine interfaces, and 89% in machine learning. The overall satisfaction level of the participants with the project activities was over 95%. These quantitative results were also supported by the qualitative findings, as the students indicated their overall satisfaction with the organization of the project and stated that working in teams and attending the social program helped them develop positive relationships with each other and increased their success.

Mustafa M. Inceoglu, Birol Ciloglugil
Investigating the Effect of a Mobile Learning Application for Graph Theory Education on Academic Performance

In this study, the effectiveness of a mobile learning application developed for graph theory education is investigated by evaluating its impact on the academic performance of students over three academic years. The mobile application supports graph operations such as creating and editing graphs, running basic graph algorithms, and observing their step-by-step execution. Students can also test their knowledge level by using the quiz section of the application. The study was conducted at a public university in Turkey with the participation of voluntary freshman students taking the Discrete Structures course, in which graph theory is lectured for three weeks as the final subject. The various features of the mobile application were introduced to the students by the lab assistant in 45-minute laboratory sessions held over a period of three weeks after the theoretical lectures covering the graph theory topics. After these introductory sessions, students were able to use the mobile application as a supplementary tool whenever they wanted. To investigate the effect of the mobile learning application on the academic performance of students, a quasi-experimental research design with experimental and control groups was used. Quantitative data were collected from the grades students achieved on the questions related to graph theory in the final exams of the course. The data for each year were analyzed with independent groups t-tests. The results of the statistical analyses indicated that the use of the mobile learning application contributed statistically significantly to the grades of the students in the experimental group in each year it was applied.
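
A minimal sketch of the statistical step (an independent-samples t-test comparing final-exam graph-theory grades of the experimental and control groups) is given below; the grade arrays are synthetic placeholders, not the study's data.

```python
# Independent-samples t-test on (synthetic) exam grades of two groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
experimental = rng.normal(78, 10, 60)   # students who used the mobile app
control = rng.normal(70, 10, 55)        # students who did not

t_stat, p_value = stats.ttest_ind(experimental, control)  # assumes equal variances
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 would indicate a statistically significant difference between the groups
```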

Birol Ciloglugil, Mustafa Murat Inceoglu
Backmatter
Metadata
Title
Computational Science and Its Applications – ICCSA 2020
Edited by
Prof. Dr. Osvaldo Gervasi
Beniamino Murgante
Prof. Sanjay Misra
Dr. Chiara Garau
Ivan Blečić
David Taniar
Dr. Bernady O. Apduhan
Ana Maria A.C. Rocha
Prof. Eufemia Tarantino
Prof. Carmelo Maria Torre
Prof. Yeliz Karaca
Copyright Year
2020
Electronic ISBN
978-3-030-58802-1
Print ISBN
978-3-030-58801-4
DOI
https://doi.org/10.1007/978-3-030-58802-1
