
2022 | Book

Computational Science and Its Applications – ICCSA 2022 Workshops

Malaga, Spain, July 4–7, 2022, Proceedings, Part II

Edited by: Prof. Dr. Osvaldo Gervasi, Beniamino Murgante, Sanjay Misra, Ana Maria A. C. Rocha, Dr. Chiara Garau

Publisher: Springer International Publishing

Book series: Lecture Notes in Computer Science


About this book

The eight-volume set LNCS 13375–13382 constitutes the proceedings of the 22nd International Conference on Computational Science and Its Applications, ICCSA 2022, which was held in Malaga, Spain, during July 4–7, 2022.

The first two volumes contain the proceedings of the main ICCSA 2022 conference: 57 full and 24 short papers, which were carefully reviewed and selected from 279 submissions.

The other six volumes present the workshop proceedings, containing 285 papers out of 815 submissions. These six volumes include the proceedings of the following workshops:


Advances in Artificial Intelligence Learning Technologies: Blended Learning, STEM, Computational Thinking and Coding (AAILT 2022); Workshop on Advancements in Applied Machine-learning and Data Analytics (AAMDA 2022); Advances in information Systems and Technologies for Emergency management, risk assessment and mitigation based on the Resilience (ASTER 2022); Advances in Web Based Learning (AWBL 2022); Blockchain and Distributed Ledgers: Technologies and Applications (BDLTA 2022); Bio and Neuro inspired Computing and Applications (BIONCA 2022); Configurational Analysis For Cities (CA Cities 2022); Computational and Applied Mathematics (CAM 2022); Computational and Applied Statistics (CAS 2022); Computational Mathematics, Statistics and Information Management (CMSIM); Computational Optimization and Applications (COA 2022); Computational Astrochemistry (CompAstro 2022); Computational methods for porous geomaterials (CompPor 2022); Computational Approaches for Smart, Conscious Cities (CASCC 2022); Cities, Technologies and Planning (CTP 2022); Digital Sustainability and Circular Economy (DiSCE 2022); Econometrics and Multidimensional Evaluation in Urban Environment (EMEUE 2022); Ethical AI applications for a human-centered cyber society (EthicAI 2022); Future Computing System Technologies and Applications (FiSTA 2022); Geographical Computing and Remote Sensing for Archaeology (GCRSArcheo 2022); Geodesign in Decision Making: meta planning and collaborative design for sustainable and inclusive development (GDM 2022); Geomatics in Agriculture and Forestry: new advances and perspectives (GeoForAgr 2022); Geographical Analysis, Urban Modeling, Spatial Statistics (Geog-An-Mod 2022); Geomatics for Resource Monitoring and Management (GRMM 2022); International Workshop on Information and Knowledge in the Internet of Things (IKIT 2022); 13th International Symposium on Software Quality (ISSQ 2022); Land Use monitoring for Sustainability (LUMS 2022); Machine Learning for Space and Earth Observation Data (MALSEOD 2022); Building multi-dimensional models for assessing complex environmental systems (MES 2022); MOdels and indicators for assessing and measuring the urban settlement deVElopment in the view of ZERO net land take by 2050 (MOVEto0 2022); Modelling Post-Covid cities (MPCC 2022); Ecosystem Services: nature’s contribution to people in practice. Assessment frameworks, models, mapping, and implications (NC2P 2022); New Mobility Choices For Sustainable and Alternative Scenarios (NEMOB 2022); 2nd Workshop on Privacy in the Cloud/Edge/IoT World (PCEIoT 2022); Psycho-Social Analysis of Sustainable Mobility in The Pre- and Post-Pandemic Phase (PSYCHE 2022); Processes, methods and tools towards RESilient cities and cultural heritage prone to SOD and ROD disasters (RES 2022); Scientific Computing Infrastructure (SCI 2022); Socio-Economic and Environmental Models for Land Use Management (SEMLUM 2022); 14th International Symposium on Software Engineering Processes and Applications (SEPA 2022); Ports of the future - smartness and sustainability (SmartPorts 2022); Smart Tourism (SmartTourism 2022); Sustainability Performance Assessment: models, approaches and applications toward interdisciplinary and integrated solutions (SPA 2022); Specifics of smart cities development in Europe (SPEED 2022); Smart and Sustainable Island Communities (SSIC 2022); Theoretical and Computational Chemistry and its Applications (TCCMA 2022); Transport Infrastructures for Smart Cities (TISC 2022); 14th International Workshop on Tools and Techniques in Software Development Process (TTSDP 2022); International Workshop on Urban Form Studies (UForm 2022); Urban Regeneration: Innovative Tools and Evaluation Model (URITEM 2022); International Workshop on Urban Space and Mobilities (USAM 2022); Virtual and Augmented Reality and Applications (VRA 2022); Advanced and Computational Methods for Earth Science Applications (WACM4ES 2022); Advanced Mathematics and Computing Methods in Complex Computational Systems (WAMCM 2022).

Table of Contents

Frontmatter

International Workshop on Computational Optimization and Applications (COA 2022)

Frontmatter
External Climate Data Extraction Using the Forward Feature Selection Method in the Context of Occupational Safety

Global climate change and the increase in average temperatures are major contemporary problems that have not been considered as external factors that increase accident risk. Studies that include climate information as a safety parameter in machine learning models designed to predict the occurrence of accidents are uncommon. This study aims to create a dataset with the most relevant climatic elements in order to obtain better predictions. The results will be applied in future studies to correlate with the accident history of a retail sector company, to understand its impact on accident risk. The information was collected from the National Oceanic and Atmospheric Administration (NOAA) climate database and processed by a wrapper method to ensure the selection of the most relevant features. The main goal is to retain all the features in the dataset without causing significant negative impacts on the prediction score.

Felipe G. Silva, Inês Sena, Laires A. Lima, Florbela P. Fernandes, Maria F. Pacheco, Clara B. Vaz, José Lima, Ana I. Pereira
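Editor's note: the wrapper-based forward selection described in the abstract can be sketched as a greedy loop. The feature names and scoring function below are hypothetical stand-ins, not the paper's NOAA features or accident-prediction model.

```python
def forward_feature_selection(features, score, tol=0):
    """Greedy wrapper method: repeatedly add the single feature whose
    inclusion improves the model score most; stop when no candidate
    improves the current score by more than `tol`."""
    selected, remaining = [], list(features)
    best = score(selected)
    while remaining:
        top_score, top_f = max((score(selected + [f]), f) for f in remaining)
        if top_score - best <= tol:
            break
        selected.append(top_f)
        remaining.remove(top_f)
        best = top_score
    return selected, best

# Hypothetical scorer: per-feature relevance minus a redundancy penalty.
# In practice `score` would be a cross-validated accident-prediction model.
relevance = {"temperature": 50, "humidity": 30, "wind": 10, "pressure": 5}

def toy_score(subset):
    return sum(relevance[f] for f in subset) - 2 * len(subset) ** 2

subset, best = forward_feature_selection(relevance, toy_score)
```

With this toy scorer the loop keeps "temperature" and "humidity" and stops once further additions no longer pay for the redundancy penalty.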
Dynamic Analysis of the Sustainable Performance of Electric Mobility in European Countries

As part of the ongoing climate and energy framework, the European Commission recently raised the 2030 greenhouse gas emission reduction target, moving towards a climate-neutral economy. Transportation represents almost a quarter of Europe’s greenhouse gas emissions, and it is the only remaining sector whose emissions are still increasing above 1990 levels. Considering also the evolving need to reduce dependency on fossil fuels, Europe’s strategy has been designed to support an irreversible shift toward low-emission electric mobility. In this context, the present work assesses the performance of electric mobility in European countries through a dynamic analysis over the period 2015–2019, framed in four sustainability dimensions: economy, technology, environment and society. The methodology aggregates several sub-indicators into a composite indicator using Data Envelopment Analysis, and evaluates the dynamic change in sustainable performance through the biennial Malmquist index. The main results indicate that total productivity change has improved, mainly due to the progression of the frontier observed for all countries from 2018 onwards. However, an increasing number of countries have had difficulty adopting the best sustainable electric mobility practices, making it necessary to design strategies to promote them, mainly in underperforming countries.

Clara B. Vaz, Ângela P. Ferreira

Open Access

On Computational Procedures for Optimising an Omni-Channel Inventory Control Model

Dynamic programming (DP) and specifically Markov Decision Problems (MDP) are often seen in inventory control as a theoretical path towards optimal policies, which are (often) not tractable due to the curse of dimensionality. A careful bounding of decision and state space and use of resources may provide the optimal policy for realistic instances despite the dimensionality of the problem. We will illustrate this process for an omni-channel inventory control model where the first dimension problem is to keep track of the outstanding ordered quantities and the second dimension is to keep track of items sold online that can be returned.

Joost Goedhart, Eligius M. T. Hendrix
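Editor's note: the bounded-DP approach described in the abstract can be illustrated on a deliberately tiny single-channel inventory MDP solved by value iteration. The paper's omni-channel model additionally tracks outstanding orders and online returns; the costs and demand distribution below are invented for illustration.

```python
# Toy single-echelon model: on-hand stock 0..CAP, an order q arrives at
# once, and random demand is lost if unmet.
CAP = 5
DEMAND = {0: 0.2, 1: 0.5, 2: 0.3}           # demand distribution
ORDER_COST, HOLD_COST, LOST_COST = 1.0, 0.2, 4.0
GAMMA = 0.95                                 # discount factor

def step(stock):
    """Expected one-period cost and distribution over next states."""
    cost, nxt = 0.0, {}
    for d, p in DEMAND.items():
        sold = min(stock, d)
        after = stock - sold
        cost += p * (HOLD_COST * after + LOST_COST * (d - sold))
        nxt[after] = nxt.get(after, 0.0) + p
    return cost, nxt

def value_iteration(tol=1e-8):
    V = [0.0] * (CAP + 1)
    while True:
        newV, policy = [], []
        for s in range(CAP + 1):
            best = None
            for q in range(CAP - s + 1):     # bounded decision space
                cost, nxt = step(s + q)
                val = ORDER_COST * q + cost + GAMMA * sum(
                    p * V[s2] for s2, p in nxt.items())
                if best is None or val < best[0]:
                    best = (val, q)
            newV.append(best[0])
            policy.append(best[1])
        if max(abs(a - b) for a, b in zip(newV, V)) < tol:
            return newV, policy
        V = newV

V, policy = value_iteration()
```

The careful bounding the abstract mentions shows up here as the restriction `q <= CAP - s`: the decision and state spaces stay finite, so the optimal policy is computable despite the curse of dimensionality in richer models.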
A Bibliometric Review and Analysis of Traffic Lights Optimization

The significant increase in the number of vehicles in urban areas raises the challenge of urban mobility. Researchers in this area suggest that most daily delays in urban travel times are caused by intersections, which could be reduced if the traffic lights at these intersections were more efficient. The use of simulation for real intersections can be effective in optimizing cycle times and improving traffic light timing to coordinate vehicles passing through intersections. From these themes emerge the research questions: What are the existing approaches (optimization techniques and simulation) to managing traffic lights smartly? What kind of data (offline and online) are used for traffic light optimization? How beneficial is it to propose an optimization approach to the traffic system? This paper aims to answer these questions through a bibliometric literature review. In total, 93 articles were analyzed. The main findings revealed that the United States and China are the countries with the most studies published in the last ten years. Moreover, Particle Swarm Optimization is a frequently used approach, and there is a tendency for studies to optimize real cases using real-time data, showing that the praxis of smart cities has resorted to smart traffic lights.

Gabriela R. Witeck, Ana Maria A. C. Rocha, Gonçalo O. Silva, António Silva, Dalila Durães, José Machado
A Genetic Algorithm for Forest Firefighting Optimization

In recent years, a large number of fires have ravaged planet Earth. A forest fire is a natural phenomenon that destroys the forest ecosystem in a given area. There are many factors that cause forest fires, for example, weather conditions, the increase of global warming and human action. Currently, there has been a growing focus on determining the ignition sources responsible for forest fires. Optimization has been widely applied in forest firefighting problems, allowing improvements in the effectiveness and speed of firefighters’ actions. The better and faster the firefighting team performs, the less damage is done. In this work, a forest firefighting resource scheduling problem is formulated in order to obtain the best ordered sequence of actions to be taken by a single firefighting resource in combating multiple ignitions. The objective is to maximize the unburned area, i.e., to minimize the burned area caused by the ignitions. A problem with 10 fire ignitions located in the district of Braga, in Portugal, was solved using a genetic algorithm. The results obtained demonstrate the usefulness and validity of this approach.

Marina A. Matos, Ana Maria A. C. Rocha, Lino A. Costa, Filipe Alvelos
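Editor's note: a permutation-encoded genetic algorithm of the kind used for such scheduling problems can be sketched as follows. The instance (suppression times, spread rates) and the linear burned-area model are toy assumptions, not the paper's fire-spread model for the Braga district.

```python
import random

random.seed(42)

# Toy instance: each ignition has a suppression time and a spread rate;
# area burned at ignition i grows as rates[i] * (time until it is attended).
times = [3, 1, 4, 2, 5, 2, 3, 1, 4, 2]
rates = [2, 5, 1, 4, 3, 1, 5, 2, 2, 4]
N = len(times)

def burned_area(order):
    t, total = 0, 0
    for i in order:
        t += times[i]
        total += rates[i] * t
    return total

def crossover(a, b):
    """Order crossover (OX): copy a slice of `a`, fill the rest from `b`."""
    lo, hi = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[lo:hi] = a[lo:hi]
    fill = [g for g in b if g not in child]
    for k in range(N):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(order, p=0.2):
    if random.random() < p:
        i, j = random.sample(range(N), 2)
        order[i], order[j] = order[j], order[i]
    return order

def genetic_algorithm(pop_size=40, generations=200):
    pop = [random.sample(range(N), N) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=burned_area)            # elitist selection
        elite = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=burned_area)

best = genetic_algorithm()
```

For this toy cost the optimal ordering is known analytically (attend ignitions by increasing time/rate ratio), which makes it a convenient sanity check for the GA.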
On Tuning the Particle Swarm Optimization for Solving the Traffic Light Problem

In everyday routines, there are multiple situations of high traffic congestion, especially in large cities. Traffic light timed regulated intersections are one of the solutions used to improve traffic flow without the need for large-scale and costly infrastructure changes. A specific situation where traffic lights are used is on single-lane roads, often found on roads under maintenance, narrow roads or bridges where it is impossible to have two lanes. In this paper, a simulation-optimization strategy is tested for this scenario. A Particle Swarm Optimization algorithm is used to find the optimal solution to the traffic light timing problem in order to reduce the waiting times for crossing the lane in a simulated vehicle system. To assess vehicle waiting times, a network is implemented using the Simulation of Urban MObility software. The performance of the PSO is analyzed by testing different parameters of the algorithm in solving the optimization problem. The results of the traffic light time optimization show that the proposed methodology is able to obtain a decrease of almost 26% in the average waiting times.

Gonçalo O. Silva, Ana Maria A. C. Rocha, Gabriela R. Witeck, António Silva, Dalila Durães, José Machado
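Editor's note: a plain global-best PSO for green-time splits might look like the sketch below. The two-bottleneck waiting-time objective is an invented analytic stand-in for the SUMO simulation used in the paper; swarm size, inertia `w`, and the acceleration coefficients `c1`, `c2` are exactly the kind of parameters the paper tunes.

```python
import random

random.seed(1)

# Toy objective: waiting time as a function of the green-time split
# (fraction for one direction) at two single-lane bottlenecks.
ARRIVALS = [(4.0, 1.0), (2.0, 2.0)]   # demand on each side of each bottleneck

def waiting_time(x):
    return sum(a / g + b / (1.0 - g) for (a, b), g in zip(ARRIVALS, x))

LO, HI = 0.05, 0.95                    # bounds on each green fraction

def pso(objective, dim, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(LO, HI) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(HI, max(LO, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, val = pso(waiting_time, dim=2)
```

The analytic optimum of this toy objective is a split of 2/3 at the first bottleneck and 1/2 at the second, with total cost 17, so convergence is easy to verify.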
A Reactive GRASP Algorithm for the Multi-depot Vehicle Routing Problem

The vehicle routing problem (VRP) is a well-known, hard-to-solve problem in the literature. In this paper, we describe a reactive greedy randomized adaptive search procedure algorithm (reactive GRASP for short), using a variable neighborhood descent (VND) algorithm as the local search procedure, to solve the multi-depot vehicle routing problem (MDVRP). This algorithm, called RGRASP+VND, combines four distinct local search procedures and a clustering technique. The Cordeau et al. dataset, a widely known MDVRP benchmark, is considered for the experimental tests. RGRASP+VND achieves better results on most small instances and a lower average solution across all instances when compared to earlier GRASP approaches in the MDVRP literature.

Israel Pereira de Souza, Maria Claudia Silva Boeres, Renato Elias Nunes de Moraes, João Vinicius Corrêa Thompson
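Editor's note: the reactive GRASP idea (randomized greedy construction, local search, and adaptive selection of the greediness parameter alpha) can be sketched on a toy single-depot routing instance with a 2-opt local search. The paper addresses the multi-depot case with a VND of four neighborhoods plus clustering; everything below is an illustrative simplification.

```python
import math, random

random.seed(7)

# Toy single-depot instance: visit all points and return to depot 0.
PTS = [(0, 0), (2, 9), (7, 3), (5, 8), (9, 9), (1, 4), (8, 1), (4, 2)]

def dist(i, j):
    (x1, y1), (x2, y2) = PTS[i], PTS[j]
    return math.hypot(x1 - x2, y1 - y2)

def tour_len(t):
    return sum(dist(t[k], t[(k + 1) % len(t)]) for k in range(len(t)))

def greedy_randomized(alpha):
    """Construction: pick the next stop from a restricted candidate list."""
    tour, left = [0], set(range(1, len(PTS)))
    while left:
        cand = sorted(left, key=lambda j: dist(tour[-1], j))
        rcl = cand[: max(1, int(alpha * len(cand)))]
        nxt = random.choice(rcl)
        tour.append(nxt)
        left.remove(nxt)
    return tour

def two_opt(tour):
    """Local search: reverse segments while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_len(new) < tour_len(tour) - 1e-12:
                    tour, improved = new, True
    return tour

def reactive_grasp(alphas=(0.1, 0.3, 0.5), iters=60):
    """Reactive GRASP: bias the choice of alpha toward values that
    produced good solutions in earlier iterations."""
    scores = {a: 1.0 for a in alphas}
    best, best_len = None, float("inf")
    for _ in range(iters):
        a = random.choices(alphas, weights=[scores[x] for x in alphas])[0]
        t = two_opt(greedy_randomized(a))
        length = tour_len(t)
        scores[a] += 1.0 / length            # reward the alpha that was used
        if length < best_len:
            best, best_len = t, length
    return best, best_len

best_tour, best_len = reactive_grasp()
```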
How Life Transitions Influence People’s Use of the Internet: A Clustering Approach

This research aimed, firstly, to define a conceptual model that considers potential resources/challenges (Physical, Cognitive, Emotional, Social, Material, Environmental, Digital) and describes how these influence Internet use and modify human behavior during life transitions (e.g., changing school, finding a job). Secondly, starting from that model, user profiles were outlined. Instead of grouping study participants into pre-defined groups, clustering techniques were used to group users with similar profiles. The main advantage of this methodological approach is that the participant groups, i.e., the different user profiles, emerged intrinsically from the data. A cross-sectional study was proposed based on an online questionnaire. The sample consists of 1,524 participants. Three clusters emerged with different mean ages: young adult users (mean age = 33.83), youngest users (25.79), and oldest users (36.80). Differences were identified across all dimensions measured, particularly between the youngest and oldest users.

Martina Benvenuti, Humberto Rocha, Isabel Dórdio Dimas, Elvis Mazzoni

Open Access

On Monotonicity Detection in Simplicial Branch and Bound over a Simplex

The concept of exploiting proven monotonicity for dimension reduction and elimination of partition sets is well known in the field of Interval Arithmetic Branch and Bound (B&B). Part of these concepts can be applied in simplicial B&B over a box. Our focus here is on minimizing a function over a feasible set of lower simplicial dimension, as in blending and portfolio optimization problems. How can monotonicity be detected and exploited in a B&B context? We found that feasible directions can be used to derive bounds on the directional derivative. Specifically, Linear Programming can be used to detect the sharpest bounds.

L. G. Casado, B. G.-Tóth, E. M. T. Hendrix, F. Messine
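Editor's note: the cruder, non-LP version of this idea (bounding the directional derivative directly from interval enclosures of the partial derivatives) fits in a few lines; the paper's LP-based bounds would be sharper. The gradient boxes and directions below are invented examples.

```python
def directional_derivative_bounds(grad_boxes, d):
    """Enclose d . grad(f) over a region, given an interval enclosure
    (lo, hi) for each partial derivative of f on that region."""
    lo = hi = 0.0
    for (gl, gh), di in zip(grad_boxes, d):
        terms = (di * gl, di * gh)
        lo += min(terms)
        hi += max(terms)
    return lo, hi

def monotonicity(grad_boxes, d):
    """If the enclosure has constant sign, f is monotone along d and the
    corresponding part of the partition set can be reduced or discarded."""
    lo, hi = directional_derivative_bounds(grad_boxes, d)
    if lo > 0:
        return "increasing"
    if hi < 0:
        return "decreasing"
    return "unknown"
```

For example, with df/dx enclosed in [2, 4] and df/dy in [5, 5], the feasible direction (1, -1) gives the enclosure [-3, -1], proving f is decreasing along that direction.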
Virtual Screening Based on Electrostatic Similarity and Flexible Ligands

Virtual Screening (VS) is a technique aimed at reducing the time and budget required in drug discovery campaigns. The idea consists of applying computational procedures to prefilter databases to a subset of potential compounds, to be characterized experimentally in later phases. The problem lies in the fact that current VS methods make simplifications, meaning they are not exhaustive. One particularly common simplification is to consider the molecules as rigid. Such an assumption greatly reduces the computational complexity of the optimization problem to be solved, but it may result in poor or inefficient predictions. In this work, we have extended the features of OptiPharm, a recently developed piece of software, by applying a methodology that considers the flexibility of the molecules. The new OptiPharm has several strengths over its previous version. More precisely, it (i) includes a prefilter based on molecule descriptors, (ii) simulates molecule flexibility by computing different poses for each rotatable bond, (iii) reduces the search space dimension, and (iv) introduces circular limits for the angular variables to enhance searchability. As the results show, these improvements help OptiPharm achieve better predictions.

Savíns Puertas-Martín, Juana L. Redondo, Antonio J. Banegas-Luna, Ester M. Garzón, Horacio Pérez-Sánchez, Valerie J. Gillet, Pilar M. Ortigosa
Solving a Capacitated Waste Collection Problem Using an Open-Source Tool

Increasing complexity in municipal solid waste streams worldwide is pressing Solid Waste Management Systems (SWMS), which need solutions to manage the waste properly. Waste collection and transport is the first task, traditionally carried out by countries/municipalities responsible for waste management. In this approach, drivers are responsible for decision-making regarding collection routes, leading to inefficient resource expenses. In this sense, strategies to optimize waste collection routes are receiving increasing interest from authorities, companies and the scientific community. Works in this strand usually focus on waste collection route optimization in big cities, but small towns could also benefit from technological development to improve their SWMS. Waste collection is related to combinatorial optimization that can be modeled as the capacitated vehicle routing problem. In this paper, a Capacitated Waste Collection Problem will be considered to evaluate the performance of metaheuristic approaches in waste collection optimization in the city of Bragança, Portugal. The algorithms used are available on Google OR-tools, an open-source tool with modules for solving routing problems. The Guided Local Search obtained the best results in optimizing waste collection planning. Furthermore, a comparison with real waste collection data showed that the results obtained with the application of OR-Tools are promising to save resources in waste collection.

A. S. Silva, Filipe Alves, J. L. Diaz de Tuesta, Ana Maria A. C. Rocha, A. I. Pereira, A. M. T. Silva, Paulo Leitão, H. T. Gomes
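Editor's note: for illustration only, a capacity-aware nearest-neighbor construction shows the structure of the capacitated waste collection problem; the bin locations, fill levels, and truck capacity are invented. In the paper this is solved far better by OR-Tools' routing module with the Guided Local Search metaheuristic.

```python
import math

# Toy instance: a depot plus waste bins with coordinates and fill levels.
DEPOT = (0, 0)
BINS = {1: ((2, 3), 4), 2: ((5, 1), 3), 3: ((6, 5), 5),
        4: ((1, 6), 2), 5: ((7, 2), 4), 6: ((3, 7), 3)}
CAPACITY = 8

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_routes():
    """Build routes greedily: visit the nearest unserved bin that still
    fits; return to the depot (new route) when the truck is full."""
    unserved = set(BINS)
    routes = []
    while unserved:
        route, load, here = [], 0, DEPOT
        while True:
            fits = [b for b in unserved if load + BINS[b][1] <= CAPACITY]
            if not fits:
                break
            nxt = min(fits, key=lambda b: dist(here, BINS[b][0]))
            route.append(nxt)
            load += BINS[nxt][1]
            here = BINS[nxt][0]
            unserved.remove(nxt)
        routes.append(route)
    return routes

def total_distance(routes):
    total = 0.0
    for r in routes:
        here = DEPOT
        for b in r:
            total += dist(here, BINS[b][0])
            here = BINS[b][0]
        total += dist(here, DEPOT)
    return total

routes = nearest_neighbor_routes()
```

With a total demand of 21 and capacity 8, any feasible plan needs at least three routes; a metaheuristic such as Guided Local Search then improves the ordering and assignment that this greedy construction produces.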
A Systematic Literature Review About Multi-objective Optimization for Distributed Manufacturing Scheduling in the Industry 4.0

Multi-objective optimization problems are frequent in many engineering domains, namely in distributed manufacturing scheduling. In the current Industry 4.0 era, these problems are becoming even more complex, due to the increase in data sets arising from industry, thus requiring appropriate methods to solve them in real time. In this paper, the results of a Systematic Literature Review are presented to reveal the state of the art in this scientific domain and to identify the main research gaps in the current digitalization era. The results obtained show the importance of multi-objective optimization approaches. Typically, when addressing large-scale real problems, the existence of many objectives benefits from the establishment of some level of trade-off between objectives. In this paper, a summarized description and analysis is presented of several main issues currently arising in companies that require the application of multi-objective optimization based distributed scheduling, to enable them to fulfill the requisites imposed by Industry 4.0. In this context, issues related to energy consumption, among other customer-oriented objectives, are the focus, to properly support decision-making through the analysis of a set of 33 main publications.

Francisco dos Santos, Lino A. Costa, Leonilde Varela
On Active-Set LP Algorithms Allowing Basis Deficiency

An interesting phenomenon in linear programming (LP) is how to deal with solutions in which the number of nonzero variables is less than the number of rows of the matrix in standard form. An interesting approach is that of basis-deficiency-allowing (BDA) simplex variations, which work with a subset of independent columns of the coefficient matrix in standard form, where the basis is not necessarily represented by a square matrix. By taking a different view of the usual dual-primal non-symmetric interaction, our aim is to show a relation between BDA and the non-simplex active-set methods. The whole is illustrated by several numerical examples. These ideas may aid the understanding of current BDA approaches to sparse implementation and to dealing with Phase I.

Pablo Guerrero-García, Eligius M. T. Hendrix

Open Access

On the Design of a New Stochastic Meta-Heuristic for Derivative-Free Optimization

Optimization problems are frequent in several fields, such as the different branches of Engineering. In some cases, the objective function exposes mathematically exploitable properties that allow exact solutions. When that is not the case, heuristics are appreciated. This situation occurs when the objective function involves numerical simulations and sophisticated models of reality. Then, population-based meta-heuristics, such as genetic algorithms, are widely used because they are independent of the objective function. Unfortunately, they have multiple parameters and generally require numerous function evaluations to find competitive solutions reliably. An attractive alternative is DIRECT, which handles the objective function as a black box like the previous meta-heuristics but is almost parameter-free and deterministic. Unfortunately, its rectangle-division behavior is rigid, and it may require many function evaluations for degenerate cases. This work presents an optimizer that combines the lack of parameters with stochasticity for high exploration capabilities. This method, called Tangram, defines a self-adapted set of division rules for the search space yet relies on a stochastic hill-climber to perform local searches. The optimizer is expected to be effective for low-dimensional problems (fewer than 20 variables) and few function evaluations. According to the results achieved, Tangram outperforms Teaching-Learning-Based Optimization (TLBO), a widespread population-based method, and a plain multi-start configuration of the stochastic hill-climber used.

N. C. Cruz, Juana L. Redondo, E. M. Ortigosa, P. M. Ortigosa
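Editor's note: a stochastic hill-climber in a plain multi-start configuration (the baseline the abstract compares against) can be sketched as follows; the objective, step-size schedule, and all parameters are illustrative, not Tangram's actual rules.

```python
import random

random.seed(3)

def sphere(x):
    """Toy objective with known minimum 0 at the origin."""
    return sum(v * v for v in x)

def hill_climber(objective, x0, steps=2000, sigma=0.5, shrink=0.995):
    """Stochastic hill-climber: perturb, keep improvements, shrink step."""
    x, fx = list(x0), objective(x0)
    for _ in range(steps):
        cand = [v + random.gauss(0, sigma) for v in x]
        fc = objective(cand)
        if fc < fx:
            x, fx = cand, fc
        sigma *= shrink
    return x, fx

def multi_start(objective, dim, starts=5, lo=-5.0, hi=5.0):
    """Plain multi-start: restart from random points, keep the best."""
    best = None
    for _ in range(starts):
        x0 = [random.uniform(lo, hi) for _ in range(dim)]
        x, fx = hill_climber(objective, x0)
        if best is None or fx < best[1]:
            best = (x, fx)
    return best

x, fx = multi_start(sphere, dim=5)
```

Tangram's contribution, per the abstract, is to replace the blind restarts with self-adapted division rules for the search space while keeping such a hill-climber as the local engine.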
Analyzing the MathE Platform Through Clustering Algorithms

University lecturers have been encouraged to adopt innovative methodologies and teaching tools in order to implement an interactive and appealing educational environment. The MathE platform was created with the main goal of providing students and teachers with a new perspective on mathematical teaching and learning in a dynamic and appealing way, relying on digital interactive technologies that enable customized study. The MathE platform has been online since 2019, having since been used by many students and professors around the world. However, the necessity for some improvements on the platform has been identified, in order to make it more interactive and able to meet the needs of students in a customized way. Based on previous studies, it is known that one of the urgent needs is the reorganization of the available resources into more than two levels (basic and advanced), as it currently is. Thus, this paper investigates, through the application of two clustering methodologies, the optimal number of levels of difficulty to reorganize the resources in the MathE platform. Hierarchical Clustering and three Bio-inspired Automatic Clustering Algorithms were applied to the database, which is composed of questions answered by the students on the platform. The results of both methodologies point out six as the optimal number of levels of difficulty to group the resources offered by the platform.

Beatriz Flamia Azevedo, Yahia Amoura, Ana Maria A. C. Rocha, Florbela P. Fernandes, Maria F. Pacheco, Ana I. Pereira
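Editor's note: determining a suitable number of difficulty levels amounts to choosing the number of clusters. A minimal sketch with 1-D k-means and an elbow rule is shown below; the paper uses Hierarchical Clustering and bio-inspired automatic clustering on real answer data, whereas the difficulty scores here are synthetic.

```python
import random

random.seed(0)

# Synthetic per-question difficulty scores (fraction of wrong answers),
# drawn around three distinct difficulty levels.
data = ([random.gauss(0.2, 0.02) for _ in range(30)]
        + [random.gauss(0.5, 0.02) for _ in range(30)]
        + [random.gauss(0.8, 0.02) for _ in range(30)])

def kmeans_1d(xs, k, iters=50):
    xs = sorted(xs)
    centers = [xs[int((i + 0.5) * len(xs) / k)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            groups[min(range(k), key=lambda c: abs(x - centers[c]))].append(x)
        centers = [sum(g) / len(g) if g else centers[c]
                   for c, g in enumerate(groups)]
    inertia = sum(min((x - c) ** 2 for c in centers) for x in xs)
    return centers, inertia

def pick_k(xs, kmax=6, threshold=0.5):
    """Elbow rule: stop adding clusters once a new cluster no longer
    cuts the within-cluster inertia by at least the threshold factor."""
    prev = kmeans_1d(xs, 1)[1]
    for k in range(2, kmax + 1):
        cur = kmeans_1d(xs, k)[1]
        if cur > threshold * prev:       # diminishing returns: stop
            return k - 1
        prev = cur
    return kmax

k = pick_k(data)
```

On this synthetic data the rule recovers the three planted levels; on the MathE answer data the analogous analysis in the paper points to six levels.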
A Tabu Search with a Double Neighborhood Strategy

Tabu Search (TS) is a well-known and very successful metaheuristic approach for hard optimization problems, linear or nonlinear. It is known to produce very good solutions, optimal or close to optimal, for some hard combinatorial optimization problems. The drawback is that in general we have no optimality certificate, but this is the price to be paid for problems where exact methods are too costly in terms of time and computational memory. Another drawback of TS is that its strategies must be studied and refined for every instance. Many proposals exist to enhance the efficiency of this metaheuristic. TS is particularly fragile in cases where the problem has many local optima, and it may be slow in escaping the region of attraction of a local optimum. If the strategy for evaluating the solutions in the neighborhood takes a long time to move away from the local optimum, it may compromise the search efficiency. In this paper we propose a double neighborhood strategy with opposite optimization directions (minimization and maximization). While one searches for the best solution in the neighborhood, the second searches for the worst, and two parallel processes develop, switching from minimization to maximization and vice versa when there is no improvement in the solution over consecutive iterations. With this proposal, it is intended that the search can escape the attraction zone of a local optimum more quickly, allowing the search space to be better explored. We present an application to a knapsack problem.

Paula Amaral, Ana Mendes, J. Miguel Espinosa
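Editor's note: the double-direction idea can be sketched on a small 0/1 knapsack: the search normally takes the best admissible move, but after a stall it deliberately takes worsening moves for a few steps to leave the basin of attraction. This is an illustrative reading of the strategy, not the authors' exact parallel scheme; the instance and all parameters are invented.

```python
VALUES = [10, 13, 7, 8, 12, 9, 6, 11]
WEIGHTS = [5, 7, 3, 4, 6, 5, 3, 6]
CAP = 20

def evaluate(sol):
    weight = sum(w for w, s in zip(WEIGHTS, sol) if s)
    value = sum(v for v, s in zip(VALUES, sol) if s)
    return value if weight <= CAP else -1     # reject infeasible solutions

def tabu_search(iters=300, tenure=4, stall_limit=10, escape_moves=3):
    """Tabu search with two opposite directions: normally take the best
    non-tabu bit-flip; after `stall_limit` non-improving iterations,
    take the *worst* moves for a few steps to escape the local basin."""
    sol = [0] * len(VALUES)
    best, best_val = sol[:], evaluate(sol)
    tabu, stall, escaping = {}, 0, 0
    for it in range(iters):
        moves = []
        for i in range(len(sol)):
            if tabu.get(i, -1) >= it:         # move still tabu
                continue
            n = sol[:]
            n[i] = 1 - n[i]
            v = evaluate(n)
            if v >= 0:
                moves.append((v, i, n))
        if not moves:                          # everything tabu: reset memory
            tabu.clear()
            continue
        val, i, sol = min(moves) if escaping else max(moves)
        tabu[i] = it + tenure
        escaping = max(0, escaping - 1)
        if val > best_val:
            best, best_val, stall = sol[:], val, 0
        else:
            stall += 1
            if stall >= stall_limit:
                escaping, stall = escape_moves, 0
    return best, best_val

best, best_val = tabu_search()
```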

International Workshop on Computational Astrochemistry (CompAstro 2022)

Frontmatter
The S+(4S) + SiH2(1A1) Reaction: Toward the Synthesis of Interstellar SiS

We have performed a theoretical investigation of the S+(4S) + SiH2(1A1) reaction, a possible formation route of the HSiS+ and SiSH+ cations that are alleged to be precursors of interstellar silicon sulfide, SiS. Electronic structure calculations allowed us to characterize the relevant features of the potential energy surface of the system and identify the reaction pathways. The reaction has two exothermic channels leading to the isomeric species 3HSiS+ and 3SiSH+, formed in conjunction with H atoms. The reaction is not characterized by an entrance barrier and, therefore, it is expected to be fast also under the very low temperature conditions of interstellar clouds. The two ions are formed in their first electronically excited state because of the spin multiplicity of the overall potential energy surface. In addition, following the suggestion that neutral species are formed by proton transfer from protonated cations to ammonia, we have derived the potential energy surface for the reactions 3HSiS+/3SiSH+ + NH3(1A1).

Luca Mancini, Marco Trinari, Emília Valença Ferreira de Aragão, Marzio Rosi, Nadia Balucani
A Theoretical Investigation of the Reactions of N(2D) and CN with Acrylonitrile and Implications for the Prebiotic Chemistry of Titan

The reactions between acrylonitrile and two different reactive species, namely N(2D) and the CN radical, were investigated by performing accurate electronic structure calculations with the aim of unveiling the most important aspects of the potential energy surfaces. For each reaction, several product channels involving the elimination of H atoms were identified, allowing the formation of different radical species depending on the initial site of attack. Both reactions appear to be exothermic and without an entrance barrier, suggesting their possible efficient role in the nitrogen-rich chemistry of the atmosphere of Titan.

Luca Mancini, Emília Valença Ferreira de Aragão, Gianmarco Vanuzzo
Formation Routes of CO from O(1D)+Toluene: A Computational Study

The interaction between oxygen atoms in their first electronically excited state (1D) and toluene has been characterized by electronic structure calculations. We focused our attention, in particular, on the different pathways leading to the formation of CO. Six different reaction channels have been investigated. Our results suggest that, while high-level calculations such as CCSD(T) are necessary for accurate energies, in particular when strong correlation effects are present, DFT methods are adequate for semi-quantitative results and provide useful information when systems larger than toluene, such as polycyclic aromatic hydrocarbons, are under investigation.

Marzio Rosi, Piergiorgio Casavecchia, Nadia Balucani, Pedro Recio, Adriana Caracciolo, Dimitrios Skouteris, Carlo Cavallotti
Stereo-Dynamics of Autoionization Reactions Induced by Ne*(3P0,2) Metastable Atoms with HCl and HBr Molecules: Experimental and Theoretical Study of the Reactivity Through Selective Collisional Angular Cones

This paper presents mass spectrometric determinations, recorded as a function of collision energy in the 0.03–0.50 eV range, from crossed molecular beam experiments involving autoionization reactions between Ne*(3P2,0) metastable atoms and HCl and HBr molecules. The total and partial ionization cross sections for both investigated systems are presented and discussed in a comparative way. The comparison of the recorded data allows us to point out similarities and differences in the collisional stereodynamics of the Ne*(3P0,2)-HCl and Ne*(3P0,2)-HBr systems. In particular, an accurate characterization of the interaction potentials, which is mandatory for a comprehensive description of Ne*-HX (X = Cl and Br) reactive collisions, has been outlined. This theoretical analysis suggests that the formation of the proton-transfer ion NeH+, as well as of other possible product ions (i.e. HX+ and NeHX+, the parent and associated ions, respectively), comes from reactivity that is selectively open along angular cones with different orientation and acceptance. In particular, the analysis highlights that the proton-transfer rearrangement reaction, which is open in both Ne*-HX autoionizing collisions, is much more efficient for Ne*+HCl than for Ne*+HBr autoionization. The present investigation points out that this efficiency variation is related to the following crucial points: (i) the different charge distribution on the HX+ ionic products, and (ii) the balance between two distinct microscopic mechanisms operative in such processes (a purely physical, indirect photoionization mechanism and a chemical, direct oxidation mechanism). These reactions are of interest in combustion chemistry, plasma physics and chemistry, astrochemistry, and the chemistry of planetary ionospheres.

Marco Parriani, Franco Vecchiocattivi, Fernando Pirani, Stefano Falcinelli

Open Access

An Ab Initio Computational Study of Binding Energies of Interstellar Complex Organic Molecules on Crystalline Water Ice Surface Models

The interstellar medium is extremely heterogeneous in terms of physical environments and chemical composition. Spectroscopic observations in recent decades have revealed the presence of gaseous material and dust grains covered in ices, predominantly of water, in interstellar clouds, the interplay of which may elucidate the existence of more than 250 molecular species. Of these species of varied complexity, several terrestrial carbon-containing compounds have been discovered, known as interstellar complex organic molecules (iCOMs) in the astrochemical argot. In order to investigate the formation of iCOMs, it is crucial to explore gas-grain chemistry, and in this regard one of the fundamental parameters is the binding energy (BE), an essential input in astrochemical models. In this work, the BEs of 13 iCOMs on a crystalline H2O-ice surface have been computed by means of quantum chemical periodic calculations. The hybrid B3LYP-D3 DFT method was used for the geometry optimizations of the adsorbate/ice systems and for computing the BEs. Furthermore, to refine the BE values, an ONIOM2-like approximation has been employed to obtain them at the CCSD(T) level; these correlate well with those obtained at B3LYP-D3. Additionally, aiming to lower the computational cost, structural optimizations were carried out at the HF-3c level of theory, followed by single-point energy calculations at B3LYP-D3, in order to obtain BE values comparable to the full DFT treatment.

Harjasnoor Kakkar, Berta Martínez-Bachs, Albert Rimola

International Workshop on Computational Methods for Porous Geo-materials (CompPor 2022)

Frontmatter
Optimization of the Training Dataset for Numerical Dispersion Mitigation Neural Network

We present an approach to constructing the training dataset for the numerical dispersion mitigation network (NDM-net). The network is designed to suppress the numerical error in simulated seismic wavefields. The training dataset is a wavefield simulated using a fine grid, and thus almost free from numerical dispersion. Generating the training dataset is the most computationally intense part of the algorithm, so it is important to reduce the number of seismograms in the training dataset to improve the efficiency of the NDM-net. In this work, we introduce a discrepancy measure between seismograms and construct the dataset so that the discrepancy between the dataset and any seismogram stays below a prescribed level.
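The selection idea described above is essentially a covering problem: keep only enough seismograms that every remaining one is within the prescribed discrepancy of a kept one. A minimal greedy sketch, assuming the pairwise discrepancy is already available as a matrix (the measure itself is defined in the paper; the function is ours):

```python
import numpy as np

def select_training_set(disc, tol):
    """Greedy cover: disc[i, j] is the discrepancy between seismograms i
    and j. Returns indices S such that every seismogram is within 'tol'
    of some selected one, which keeps the training set small."""
    n = disc.shape[0]
    uncovered = set(range(n))
    selected = []
    while uncovered:
        # pick the seismogram that covers the most uncovered ones
        best = max(uncovered,
                   key=lambda i: sum(disc[i, j] <= tol for j in uncovered))
        selected.append(best)
        uncovered -= {j for j in uncovered if disc[best, j] <= tol}
    return selected
```

For seismograms acquired along a line, the discrepancy typically grows with source separation, so the greedy cover picks a sparse, roughly equispaced subset.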

Kirill Gadylshin, Vadim Lisitsa, Kseniia Gadylshina, Dmitry Vishnevsky
Numerical Solution of Anisotropic Biot Equations in Quasi-static State

Frequency-dependent seismic attenuation can indicate the transport properties of fractured media and the fluid mobility within them. In particular, wave-induced fluid flow (WIFF) appears during seismic wave propagation, between fractures and background as well as within interconnected fractures, and causes intensive attenuation. We present an effective algorithm for numerical upscaling to estimate attenuation in anisotropic fractured porous fluid-saturated media. The algorithm is based on the numerical solution of the quasi-static Biot equations using a finite-difference approximation. The presented algorithm is used to estimate seismic attenuation in fractured media with high fracture connectivity. The results of numerical experiments demonstrate the influence of the physical properties and microscale anisotropy of the fracture-filling material on seismic attenuation.

Sergey Solovyev, Mikhail Novikov, Vadim Lisitsa
Effect of the Interface Roughness on the Elastic Moduli

In this paper, we study the effect of interface roughness on the elastic parameters of layered media. We consider three-dimensional models of a layered medium with two different elastic materials inside and outside the layer. We generate a first class of models, in which the interfaces between the layers are rough and the elastic parameters of the inner layers are fixed. Then, a numerical upscaling technique is applied to estimate the effective stiffness tensor. Next, we downscale the stiffness tensor to reconstruct new elastic parameters of the inner layer for a second class of models with flat interfaces; that is, the uncertainty in the model geometry is mapped to uncertainty in the stiffness tensor components for a fixed model geometry. After that, we propose an algorithm for extending the restored elastic tensors to arbitrary uncertainty parameters, applying bilinear regression with respect to the interface roughness parameters and bilinear interpolation, using the two nearest points, with respect to the physical parameters of the inner layers. Verification of the algorithm shows that the errors in the recovered covariance matrix do not exceed 7%; that is, the algorithm can be used to statistically simulate second-class models with flat interfaces for arbitrary values of the interface roughness and of the physical parameters of the layers in the first class of models.
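The interpolation step over the parameter plane can be illustrated with plain bilinear interpolation; the helper below is a generic sketch of that building block, not the authors' exact procedure:

```python
def bilerp(f00, f10, f01, f11, s, t):
    """Bilinear interpolation of a quantity (e.g. a stiffness-tensor
    component) between four sampled parameter points; s, t in [0, 1]
    are normalized coordinates over the two parameters."""
    return (f00 * (1 - s) * (1 - t) + f10 * s * (1 - t)
            + f01 * (1 - s) * t + f11 * s * t)
```

At the corners the interpolant reproduces the samples exactly, and along each edge it reduces to linear interpolation in one parameter.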

Tatyana Khachkova, Vadim Lisitsa, Dmitry Kolyukhin

International Workshop on Computational Science and HPC (CSHPC 2022)

Frontmatter
Design and Implementation of an Efficient Priority Queue Data Structure

Priority queues are among the most useful of all data structures. Existing priority queues have a vast amount of overhead associated with them. There is a need to have a simple data structure that can be used as a priority queue with low overhead. The data structure should have the operation where the data item with the minimum/maximum value is the next item to be deleted. The data structure should also support the function of a calendar queue where elements with the same or similar priority have the same key. For example, all of today’s appointments will have today’s date as their key. To that end, a bucket data structure has been developed that has both of these features. We address the functionality and efficiency of the data structure for the applications of adaptive multivariate integration and the 15-puzzle. In adaptive multivariate integration, the key is an error estimate, which is a floating-point number. The data for this application has many items with similar keys and the maximum is the next item to be deleted. The key for the 15-puzzle is generated by a heuristic cost function. The data for this application has many items with the same key and the minimum is the next item to be deleted. This paper presents an implementation of the bucket priority queue and discusses its performance.
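The bucket idea described above, where items with the same or similar priority share a key, can be sketched in a few lines. This is a generic illustration (names and layout are ours), not the implementation presented in the paper:

```python
from collections import defaultdict

class BucketQueue:
    """Minimal bucket priority queue: keys within one 'width' share a
    bucket, and delete-min scans the bucket keys. Assumes nonnegative
    keys; order among items in the same bucket is arbitrary."""

    def __init__(self, width=1.0):
        self.width = width
        self.buckets = defaultdict(list)

    def push(self, key, item):
        self.buckets[int(key / self.width)].append(item)

    def pop_min(self):
        k = min(self.buckets)        # smallest occupied bucket
        bucket = self.buckets[k]
        item = bucket.pop()
        if not bucket:
            del self.buckets[k]
        return item
```

For a calendar-queue use case, `width` would be one day, so all of today's appointments land in the same bucket; a max-variant for the integration application would take `max(self.buckets)` instead.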

James Rhodes, Elise de Doncker
Acceleration of Multiple Precision Solver for Ill-Conditioned Algebraic Equations with Lower Precision Eigensolver

Some ill-conditioned algebraic equations have roots that are difficult to obtain accurately and coefficients that must be expressed with multiple precision floating-point numbers. When all the roots are simple, solving the problem via an eigensolver (the eigenvalue method) is well-conditioned provided the corresponding companion matrix has a small condition number. However, solving such equations directly with Newton or simultaneous iteration methods (direct iterative methods) must be considered ill-conditioned because of the increasing density of the root distribution. Although the direct iterative methods require a longer floating-point mantissa than the eigenvalue method, the total computational costs cannot be compared a priori. In this study, we target Wilkinson's example and the Chebyshev quadrature problem as examples of ill-conditioned algebraic equations, and present concrete numerical results demonstrating that the direct iterative method can perform better than a standard eigensolver.
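The eigenvalue method referred to above is what, for instance, `numpy.roots` implements: build the companion matrix of the coefficient vector and diagonalize it. A small sketch on a degree-10 Wilkinson-type polynomial (the paper's experiments use higher degrees, harder instances, and multiple precision):

```python
import numpy as np

# numpy.roots forms the companion matrix of the coefficients and feeds
# it to an eigensolver. Polynomials with equispaced roots like
# Wilkinson's example illustrate why direct iteration on the monomial
# coefficients needs extended precision as the degree grows.
coeffs = np.poly(np.arange(1, 11.0))        # (x-1)(x-2)...(x-10)
roots = np.sort(np.roots(coeffs).real)
err = np.max(np.abs(roots - np.arange(1, 11.0)))
```

At degree 10 the recovered roots are still accurate in double precision; pushing the same experiment to degree 20 already degrades them badly, which motivates the multiple precision arithmetic discussed in the paper.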

Tomonori Kouya

Open Access

Study of Galaxy Collisions and Thermodynamic Evolution of Gas Using the Exact Integration Scheme

Radiative cooling of the interstellar medium plays a vital role in galaxy formation and evolution. However, the cooling time in high-density, star-forming regions is much shorter than the dynamical time of the gas. Solving physical phenomena that coexist on significantly different timescales is challenging in numerical simulations, and this is known as the overcooling problem in the study of galaxy formation. Townsend (2009) developed the Exact Integration (EI) scheme, which provides a stable solution for the cooling term in the energy equation of astrophysical fluid dynamics regardless of the size of the simulation time step. We apply the EI scheme to define an effective cooling time that accounts for the temperature dependence of the cooling rate, and investigate the thermodynamic evolution of gas in colliding dark matter subhalos. The results show that the conventional cooling time is always shorter than the effective cooling time derived from the EI scheme, because it does not include the temperature dependence of the cooling rate. Furthermore, we run three-dimensional galaxy collision simulations to examine the difference in thermodynamic evolution between the EI scheme and the conventional Crank–Nicolson method for solving the cooling equation. Comparing the results of the two simulations, we find that the EI scheme suppresses the rapid temperature decrease after galaxy collisions. Thus, the EI scheme shows considerable potential for addressing the overcooling problem in the study of galaxy formation.
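The flavor of the EI scheme can be conveyed with a single power-law cooling term, for which the update has a closed form that stays stable for arbitrarily large time steps, unlike an explicit step. This is a simplified sketch of the idea, not the piecewise power-law implementation of Townsend (2009):

```python
from math import exp

def exact_cool(T, C, alpha, dt):
    """Exact update for the model cooling law dT/dt = -C * T**alpha,
    obtained by separating variables:
    T(t)**(1-alpha) = T0**(1-alpha) - (1-alpha)*C*t  (alpha != 1).
    Stable and positive for any dt."""
    if alpha == 1.0:
        return T * exp(-C * dt)
    x = T ** (1 - alpha) - (1 - alpha) * C * dt
    return max(x, 0.0) ** (1.0 / (1 - alpha))

def euler_cool(T, C, alpha, dt):
    # explicit step: overshoots to unphysical negative T for large dt
    return T - C * T ** alpha * dt
```

Townsend's scheme generalizes this by tabulating an invertible temporal evolution function for a piecewise power-law cooling rate, so the same closed-form update applies across temperature bins.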

Koki Otaki, Masao Mori
Regularization of Feynman 4-Loop Integrals with Numerical Integration and Extrapolation

In this paper we continue our recent work on evaluating numerical approximations for a set of 4-loop self-energy integrals required in the computation of higher orders in perturbation theory. The results are given by a Laurent expansion in the dimensional regularization parameter ε, where ε is related to the space-time dimension as ν = 4 - 2ε. Although the leading-order coefficients for the diagrams with massless internal lines are given in analytic form in the literature, we obtain them using a numerical approach and with modern computational techniques. In a similar manner, we derive results for diagrams with massive lines. The SIMD (Single Instruction, Multiple Data) nature of the computation of loop integrals based on composite lattice rules lends itself to an efficient GPU implementation. We further apply double exponential numerical integration layered over the Message Passing Interface (MPI) parallel platform. Limits of integral sequences (as the regularization parameter tends to zero) are implemented numerically using linear and nonlinear extrapolation procedures. Numerical results are given to illustrate the versatility of the methods and show some robustness with regard to the selection of sequence parameters.
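The linear extrapolation of integral sequences as ε tends to zero can be sketched as a polynomial fit in ε whose constant term estimates the limit (a generic illustration; the paper's actual procedures also include nonlinear variants):

```python
import numpy as np

def extrapolate_to_zero(eps, vals, degree=2):
    """Model vals[k] ~ c0 + c1*eps[k] + ... + cd*eps[k]**d via least
    squares on a Vandermonde matrix, and return c0, the estimated
    limit as the regularization parameter tends to zero."""
    V = np.vander(np.asarray(eps, dtype=float), degree + 1,
                  increasing=True)
    c, *_ = np.linalg.lstsq(V, np.asarray(vals, dtype=float), rcond=None)
    return c[0]
```

Evaluating the integral at a geometric sequence of ε values and extrapolating in this way recovers the finite part while the pole terms are handled separately in the Laurent expansion.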

E. de Doncker, F. Yuasa
Acceleration of Matrix Multiplication Based on Triple-Double (TD), and Triple-Single (TS) Precision Arithmetic

In this study, we present results on the acceleration of multi-component-type multiple precision matrix multiplication using the Ozaki scheme and OpenMP. We target triple-double (TD) and triple-single (TS) precision matrix multiplication on CPU and GPU, using TD and TS arithmetic based on error-free transformation (EFT) arithmetic. Owing to these combined techniques, on CPU our multiple precision matrix multiplications were more than seven times faster than Strassen matrix multiplication with AVX2. On GPU, however, as when using the Ozaki scheme, our multiple precision matrix multiplications could not be accelerated. Furthermore, we report how our accelerated matrix multiplication affects parallelization performance with OpenMP on CPU.

Taiga Utsugiri, Tomonori Kouya

International Workshop on Cities, Technologies and Planning (CTP 2022)

Frontmatter
Fragile Territories Around Cities: Analysis on Small Municipalities Within Functional Urban Areas

Although many disadvantaged areas are remote and isolated, others are close to dense urban areas. These peri-urban areas are both peripheral and fragile and are often fringes of urban areas where the well-known urban-rural mix occurs. In recent years, the most fragile urban areas have received growing attention, and policymakers have tried to address territorial disadvantages with the help of funds. Law 158/2017 is one example: it proposes twelve criteria to identify and then select small municipalities to support. In 2020, a research project proposed converting the Law's criteria into indicators, whose combinations can identify different conditions of fragility and provide a useful starting point for further studies on small municipalities. The present work uses data from that research to focus on peri-urban areas. The case study comprises small municipalities within three Functional Urban Areas (FUAs) in Central Italy: L'Aquila, Perugia and Terni. The paper then points out the relationship between indicators of fragility and the peculiar features of these areas.

Chiara Di Dato, Alessandro Marucci
Assessing Coastal Urban Sprawl and the “Linear City Model” in the Mediterranean – The Corinthian Bay Example

Urban sprawl and tourism urbanization, as prevailing trends on the Mediterranean coast, result in particular forms of ‘linear’ urban development, stretching low-density urban fabric along extended areas near the shoreline. The evolving ‘linear city model’ accounts for: land and marine environmental degradation; higher vulnerability to climate change; and unsustainable future pathways of coastal urban constellations. Assessment of this model of urban development in coastal zones can reveal distinct spatial and functional irregularities and properly guide policy remediation action. This work elaborates on the exploration and identification of the “linear city” concept in the Mediterranean through a methodological approach that integrates: high-resolution multi-temporal data for built-up areas and their GIS-enabled elaboration; spatial metrics for quantifying the morphological and spatial peculiarities and qualities of this linear city type; Principal Component and Cluster Analysis, unveiling built environment typologies; and correlation analysis, illuminating functional weaknesses by linking these typologies with urban variables, e.g. population density and accessibility to transport and urban facilities. Implementation of this approach on a Greek example – Corinthian Bay, Northern Peloponnese – reveals the discrete spatial typologies and highlights the fragmented and, in spatial and functional terms, rather unsustainable linear urban pattern of this coastal urban area.

Apostolos Lagarias, Ioannis Zacharakis, Anastasia Stratigea
I Wish You Were Here. Designing a Geostorytelling Ecosystem for Enhancing the Small Heritages’ Experience

Slow tourism is a different, more conscious and sustainable way of travelling, often targeting places secondary to cities of art and mainstream locations but rich in lesser-known social, intangible and cultural heritage. While digital technologies have provided tools to make cultural places, institutions and GLAMs accessible, top-down informative communication no longer seems sufficient to reach either the more sophisticated public or the new generations, whose information consumption is increasingly mediated by smartphones and whose experience is shared with peers through social networks. The paper proposes and discusses an exploration and a design prototype of a mobile geostorytelling ecosystem for the “Prosecco hills” – Valdobbiadene and Conegliano in the Treviso province, northern Italy, recently designated a UNESCO heritage site – based on a bi-directional communication platform. Its aim is, on the one hand, to allow minor realities – the “small heritages” – to be present efficiently and effectively online and in-situ thanks to shared communication tools, and, on the other, to allow people to create georeferenced points of interest for other users through mechanisms of storytelling and emotional connection with the territory and its visitors.

Letizia Bollini, Chiara Facchini
Smart City and Industry 4.0
New Opportunities for Mobility Innovation

The manufacturing industry is undergoing profound changes, so much so that a new phase is recognized, called Industry 4.0, involving the shape and structure of the supply chain, the production of goods, and the energy transition. An example of this epochal change is the automotive and boating sector, which is seeing the increasingly marked electrification of its products, accompanied in parallel by changes in materials and production methods (new materials, 3D printing). The development of the smart city is closely connected with the phenomenon of Industry 4.0, in terms of mutual capacity for innovation, integration and digital transition, whose effects are felt in production lines and supply chains, with tangible effects on the urban spatial distribution of goods and services. The goal of the paper is to investigate the effects of Industry 4.0 on slow and ecological mobility in new environments: protected areas (natural and semi-natural parks) and historical and cultural areas. The case studies are the Natural Park of Molentargius and the Archaeological Park of Nora in the metropolitan city of Cagliari (Sardinia, Italy), which represent a significant case of a contamination lab (Luna Rossa team, Atena and DICAAR of the University of Cagliari).

Ginevra Balletto, Giuseppe Borruso, Mara Ladu, Alessandra Milesi, Davide Tagliapietra, Luca Carboni
Digital Ecosystem and Landscape Design. The Stadium City of Cagliari, Sardinia (Italy)

The approach to the conservation and/or balanced exploitation of resources has become increasingly articulated and sensitive, because environmental quality underlies fundamental rights. This is demonstrated by successive regulatory instruments: the Environmental Impact Assessment (EIA, 1969), the Strategic Environmental Assessment (SEA, 2001), the Single Environmental Text (2006, containing the rules on environmental protection and waste management) and the 2030 Agenda (2015, with its 17 Sustainable Development Goals), aimed at avoiding and containing degradation and risks to the quality of life, including that of future generations. Among the different methodological approaches, we must also consider Landscape Assessment, to be carried out before and, in an evolutionary continuum, after the realization of a work, in order to identify interference and impacts on the territory (with a descriptive-quantitative investigation of the elements characterizing its structure from a naturalistic, anthropic and historical-cultural point of view, and a perceptive investigation of the visual impact), its relations, qualities and balances. The use of Open Source Tools, available without a commercial license and therefore within the reach of all designers and evaluators, although not yet fully widespread, certainly responds to the purposes of a complete and innovative landscape assessment. In this framework, the aim of the work is to evaluate the role of Open Source Tools in assessing the visual impact, ante and post operam, of important urban works that require continuous information and collective participation and are therefore capable of fostering the creation of shared values.

Ginevra Balletto, Giuseppe Borruso, Giulia Tanda, Roberto Mura
Towards an Augmented Reality Application to Support Civil Defense in Visualizing the Susceptibility of Flooding Risk in Brazilian Urban Areas

This paper presents an augmented reality application for visualizing floods in urban environments, aiming to raise people's awareness of occupied areas susceptible to flooding, based on historical occurrences. Environmental disasters caused by flooding occur frequently in different regions of Brazil, where urban areas grow fast with weak Master Plan control and little integrated use of flood risk maps. At the same time, Civil Defense has been challenged to devise strategies to convince the population to abandon their homes in an imminent flood disaster. To help and support these strategies, mainly to discourage occupation of susceptible areas, whether irregular or not, a mobile application named “Neocartografia” was developed. The main goal of the application is to show ordinary people the potential effect of flooding, presented in real time and interacting with their own buildings. The idea is to integrate the real image of the environment with the simulation of a certain flooding depth, set manually or derived from hydrological and hydraulic modelling. Civil Defense agents point the smartphone at a citizen's home and, through augmented reality, simulate in the real world the depth of the water according to parameters such as the speed, depth and extent of the flood waves. The results obtained with the first version of the Neocartografia application have been good enough to reach the developers' first goals, but the application must still be improved.

Gustavo Vargas de Andrade, Victor Luis Padilha, Adilson Vahldick, Francisco Henrique de Oliveira
Framework Proposal of Smart City Development in Developing Country, A Case Study - Vietnam

Smart City has been identified as a key solution for urbanization, which is considered an inevitable trend for urban areas. Many cities aim to become Smart Cities, but most fail because of various challenges. A Smart City strategy is particularly needed in developing countries, whose cities are beginning to face significant difficulties in the urbanization process. This study offers a new smart city approach and framework for implementation in developing countries, considering Vietnamese cities as the case studies. Based on a review of the critical factors and development journeys of successful smart cities globally, the study proposes groups of solutions and a framework for building Smart Cities effectively and sustainably in developing countries.

Tu Anh Trinh, Thi Hanh An Le, Le Phuc Tam Do, Nguyen Hoai Pham, Thi Bich Nguyet Phan
Studying Urban Space from Textual Data: Toward a Methodological Protocol to Extract Geographic Knowledge from Real Estate Ads

Real estate ads are a rich source of information for studying the social representation of residential space. However, extracting knowledge from them poses methodological challenges, namely in terms of their spatial content. The use of artificial intelligence techniques to find and extract knowledge and relationships from textual data improves on the classical approaches of Natural Language Processing (NLP). This paper will first conceptualize what kind of information on urban space can be targeted in real estate ads. It will then propose an automated protocol based on artificial intelligence to extract named entities and the relationships among them. The extracted information will finally be modeled as RDF graphs and queried through GeoSPARQL. First results will be proposed for the case study of real estate ads on the French Riviera, with a focus on toponymy. Perspectives for quantitative spatial analysis of the geolocated RDF models of real estate ads will also be highlighted.

Alicia Blanchi, Giovanni Fusco, Karine Emsellem, Lucie Cadorel

International Workshop on Digital Sustainability and Circular Economy (DiSCE 2022)

Frontmatter
Transforming DIGROW into a Multi-attribute Digital Maturity Model. Formalization and Implementation of the Proposal

SMEs form the backbone of the economy of most countries all over the world; increasing their performance is therefore essential to increasing people's well-being. The adoption of digital technologies is universally recognized as an essential component in achieving this objective. Unfortunately, European SMEs are lagging behind in the transition to digital. The Digital Europe Programme was launched at the end of 2021 to speed up this transition. As part of the Programme, a network of European Digital Innovation Hubs (EDIHs) is under construction to help SMEs increase their digitalization awareness and capabilities. The availability of a digital maturity assessment tool would greatly help EDIHs accomplish their mission, and this paper contributes in that direction. It starts from an already published framework suitable for assessing the digital maturity level of SMEs, as well as the capabilities associated with each level that are necessary for promoting digitally enabled growth. The framework is then transformed into a qualitative multi-attribute digital maturity model, which is implemented by means of freeware software. The proposed model returns transparent indications to SMEs about where they are on their digital transformation journey, together with their strengths and weaknesses. SMEs should take these indications into account before allocating the budget for future investments.

Paolino Di Felice, Gaetanino Paolone, Daniele Di Valerio, Francesco Pilotti, Matteo Sciamanna

International Workshop on Econometrics and Multidimensional Evaluation in Urban Environment (EMEUE 2022)

Frontmatter
A Methodological Approach for the Assessment of the Non-OSH Costs

Accidents at work represent, globally, a significant social and business cost for all production processes and working sectors. With a view to achieving the Sustainability Goals shared in the 2030 Agenda, it is necessary to design tools that can analyze and monitor the costs related to accidents, to support the Public Administration in defining effective strategies for implementing health and safety in the workplace and reducing injury-related costs. On the basis of an analysis of the literature on existing models used to calculate the costs of accidents, this research proposes a methodological approach to calculate the costs incurred by the main stakeholders (worker, enterprise, state, society), both in the case of injury and in its absence, with reference to four different scenarios. The methodological approach can be applied to different product sectors and geographical contexts and could be used to compare the effectiveness of different policy strategies.

Maria Rosaria Guarini, Rossana Ranieri, Francesco Tajani, Pierluigi Morano, Francesco Sica
A Decision-Making Process for Circular Development of City-Port Ecosystem: The East Naples Case Study

The synergy between the city and the port area is an essential component for sustaining the economic system of the European Union. In this perspective, ports have been recognised as critical strategic hubs for economic competitiveness, job opportunities and investments. Nowadays, however, their role as strategic hubs implies an increase in the space dedicated to logistics and a high economic, social and environmental impact on cities. Defining a new process of circular development for the city-port ecosystem means considering environmental and social challenges and proposing a circular urban model capable of renewing the relationship between city and port. Starting from the Sustainable Development Goals (SDGs) and the 2030 Agenda, and from the instances of circular development, the research defines an adaptive and multidimensional decision-making process capable of integrating evaluation approaches and tools to deal with the complex interactions and conflicts that characterise the city-port ecosystem. The outcomes focus on evaluating alternative scenarios for the city-port ecosystem of East Naples, Italy, towards the implementation of a circular model. The decision-making process supports an accurate reflection on the possibilities of regenerative transformation of a complex context, considering the specific needs of stakeholders and the local community.

Sabrina Sacco, Maria Cerreta
A Cost-Benefit Analysis for the Industrial Heritage Reuse: The Case of the Ex-Corradini Factory in Naples (Italy)

Cost-Benefit Analysis (CBA) is a method to evaluate a project or a policy, considering all costs and benefits to encourage a medium- and long-term vision. The CBA application fields are broad, counting as primary domains: transportation, environment, cultural heritage, energy, research and innovation, and information technology. This paper shows a Cost-Benefit Analysis for the reuse project of a former industrial plant in the city of Naples (Italy). The purpose is to identify the advantages and disadvantages of the cost-benefit approach in the case of industrial building reuse. The preliminary outcomes show that the following social externalities are likely to be subject to a certain risk degree: i) amount of CO2 emissions saved, ii) benefits for businesses (enterprises and start-ups), iii) benefits for researchers, young professionals and students, iv) impacts on the real estate market, v) technological readiness of the involved innovative processes. The authors highlight that specialists should also search for ways to generate acceptable shadow prices for short-run consequences using empirical data from different sources.

Marilisa Botte, Maria Cerreta, Pasquale De Toro, Eugenio Muccio, Francesca Nocca, Giuliano Poli, Sabrina Sacco
Unraveling the Role Played by Energy Rating Bands in Shaping Property Prices Using a Multi-criteria Optimization Approach: The Case Study of Padua’s Housing Market

As the topic of energy efficiency remains in the spotlight after a few decades of studies, several dozen recent publications deal with the potential occurrence of a price premium for green buildings, namely, those significantly outperforming the best conventional ones. Almost all of those investigations find, using regression analysis, that a price premium indeed occurs and is statistically significant, though its magnitude is still debated. In particular, the traditional hedonic price model is widely used in early studies, while the spatial autoregressive and spatial error models are increasingly employed in the newest ones. Here we suggest a different approach by turning to multi-criteria analysis. In particular, we propose combining an application of the analytic hierarchy process and linear optimization to identify the role played by energy rating bands and the energy performance index in shaping property prices. As a result, we expect to estimate the likely magnitude of the price premium for building energy efficiency. We test the approach on the local housing market in Padua, North-eastern Italy. The energy performance, as expressed by the rating bands, is found to exert a strong influence on property prices. After discussing the empirical findings and the limitations of the analytical procedure, we also outline further developments for our multi-criteria approach.
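The analytic hierarchy process step mentioned above derives criterion weights from a reciprocal pairwise comparison matrix via its principal eigenvector. A generic sketch (the comparison matrix is illustrative, not the paper's data):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via the
    principal eigenvector, the standard AHP recipe. pairwise[i, j]
    says how much more important criterion i is than j, and the
    matrix is reciprocal: pairwise[j, i] = 1 / pairwise[i, j]."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()              # normalize so the weights sum to 1

# Illustrative 3x3 comparison on Saaty's 1-9 scale:
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
```

The resulting weights could then feed the linear optimization stage as coefficients of the criteria; consistency of the judgments is usually checked via the principal eigenvalue before trusting the weights.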

Sergio Copiello, Edda Donati
Explicit and Implicit Weighting Schemes in Multi‐criteria Decision Support Systems: The Case of the National Innovative Housing Quality Program in Italy

While institutionalized and purely contractual public-private partnerships (PPPs) are supposed to be helpful in carrying out urban regeneration interventions, many such projects - especially in Italy - are developed under the framework of negotiation-based PPPs, also known as negotiating partnerships. The structuring process of negotiation-based PPPs extensively uses various valuation approaches and methods, with a remarkable role played by multi-criteria decision support systems. The literature focuses on the use of valuation approaches and methods in this field, pointing out the potentialities that can be exploited and the limitations that should be addressed. This paper places itself within this debate and tries to address an inherent issue in multi-criteria decision aid (MCDA) techniques. Here we show that there can be a gap between the explicit weighting system used in an MCDA analysis and the implicit weighting system actually employed. The issue is discussed using the National Innovative Housing Quality program, recently adopted in Italy, as a testbed. As a case study, we consider the program proposal defined by the municipality of Treviso, North-eastern Italy. Implicit weights are identified ex-post according to the allocation of funding to the intervention projects. While confirming a difference from the importance of the criteria identified a priori by the public body that governs the program, we also argue that the gap narrows if one assumes that the project options match the criteria according to a nonlinear relationship. The originality and value of this paper lie in addressing a topic that is underestimated in the reference literature; its thorough consideration might help structure more effective program proposals based on negotiating partnerships.

Aurora Ballarini, Sergio Copiello, Edda Donati
Analysis of the Difference Between Asking Price and Selling Price in the Housing Market

In Italy, the opacity of the real estate market, which often does not reveal the real level of selling prices, or, depending on the phase of the economic cycle, the low number of transactions, forces appraisers to use asking prices as comparables in the market approach. The international literature recognizes the importance of analyzing the relationship between asking prices and selling prices, or time on market, for interpreting the real estate market. In this work, the analysis of the difference is aimed at interpreting its variance by identifying the variables that carry greater weight. For this purpose, a multivariate analysis model is built on a sample of data over a 12-year interval recorded in the housing market of the city of Potenza, Italy.

Benedetto Manganelli, Francesco Paolo Del Giudice, Debora Anelli
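A multivariate model of the kind described in the abstract above can be sketched as an ordinary least-squares regression of the asking-selling spread on candidate explanatory variables. The data below are simulated (not the Potenza sample), and the regressors - time on market and floor area - are only illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated explanatory variables: time on market (days), floor area (sqm).
tom = rng.uniform(30, 400, n)
area = rng.uniform(50, 150, n)

# Simulated percentage spread between asking and selling price:
# longer marketing periods widen it, larger dwellings narrow it slightly.
spread = 2.0 + 0.02 * tom - 0.01 * area + rng.normal(0, 0.5, n)

# OLS fit: design matrix with intercept, then least squares.
X = np.column_stack([np.ones(n), tom, area])
beta, *_ = np.linalg.lstsq(X, spread, rcond=None)
print(beta.round(4))  # estimated [intercept, tom coefficient, area coefficient]
```

The estimated coefficients indicate which variables weigh most on the spread's variance, which is the stated aim of the paper's model.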
Spatial Statistical Model for the Analysis of Poverty in Italy According to Sustainable Development Goals

The “Sustainable Development Goals” indicate which changes the nations and people of the world are committed to achieving, by virtue of a global consensus obtained through a long, complex, and difficult path of dialogue and of international and interdisciplinary collaboration. Ending poverty in all its manifestations, including its most extreme forms, through interconnected strategies is the theme of Goal 1. Providing people all over the world with the support they need, such as through the promotion of social protection systems, is, in fact, the very essence of sustainable development. The objective of this work is the statistical analysis of the indicators useful for achieving the “No Poverty” Goal 1 through multidimensional statistical analysis methodologies (Totally Fuzzy and Relative) in order to understand which Italian regions need more government intervention.

Paola Perchinunno, Antonella Massari, Samuela L’Abbate, Lucia Mongelli
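The Totally Fuzzy and Relative approach mentioned in the abstract above replaces a crisp poor/non-poor cutoff with a degree of membership in the "poor" set derived from the empirical distribution of each indicator. A minimal single-indicator sketch (the regional rates below are hypothetical, and the full method also weights and aggregates multiple indicators):

```python
import numpy as np

def tfr_membership(x):
    """Rank-based fuzzy membership in the 'deprived' set:
    the unit with the highest indicator value gets membership 1,
    the lowest gets 0, the rest are spaced by their empirical rank."""
    ranks = np.argsort(np.argsort(x))      # rank of each observation
    return ranks / (len(x) - 1)            # normalize to [0, 1]

# Hypothetical poverty rates for four regions.
poverty_rate = np.array([0.10, 0.25, 0.18, 0.32])
membership = tfr_membership(poverty_rate)
print(membership.round(3))
```

Regions with membership close to 1 would be flagged as those needing more government intervention, which mirrors the paper's stated objective.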
Real Estate Sales and “Customer Satisfaction”: Assessing Transparency of Market Advising

The paper starts by describing the changing relationship between the evolution of the real estate market, housing services, and the economic crisis caused by the COVID-19 pandemic. Looking in more detail at housing and hosting services, we can trace the evolution of the concept of the home as a service, serving as the reference point for a number of different activities. At the same time, housing services, considered as a facility for external city users, changed in character and relevance. The transition of urban habits from the pre-pandemic era to the post-pandemic one generated new use values as a consequence of these new habits. The use of networks for communicating and collaborating from home, rather than from offices and clerks' workplaces, changed workplace attitudes, but at the same time reduced the usefulness of the housing stock dedicated to hosting foreign city users. The paper describes the evolving interpretation of the role of housing services, and the consequent change in use values, in a city characterized by the presence of many guest workers who use housing services by renting flats that had remained unused in the most recent years of the pandemic era, which compelled renters to become travellers.

Carmelo Maria Torre, Debora Anelli, Felicia Di Liddo, Marco Locurcio
Backmatter
Metadata
Title: Computational Science and Its Applications – ICCSA 2022 Workshops
Edited by: Prof. Dr. Osvaldo Gervasi, Beniamino Murgante, Sanjay Misra, Ana Maria A. C. Rocha, Dr. Chiara Garau
Copyright Year: 2022
Electronic ISBN: 978-3-031-10562-3
Print ISBN: 978-3-031-10561-6
DOI: https://doi.org/10.1007/978-3-031-10562-3