
2018 | Book

Computational Science and Its Applications – ICCSA 2018

18th International Conference, Melbourne, VIC, Australia, July 2-5, 2018, Proceedings, Part V

Edited by: Prof. Dr. Osvaldo Gervasi, Beniamino Murgante, Sanjay Misra, Elena Stankova, Prof. Dr. Carmelo M. Torre, Ana Maria A.C. Rocha, Prof. David Taniar, Bernady O. Apduhan, Prof. Eufemia Tarantino, Prof. Yeonseung Ryu

Publisher: Springer International Publishing

Book series: Lecture Notes in Computer Science


About this book

The five-volume set LNCS 10960–10964 constitutes the refereed proceedings of the 18th International Conference on Computational Science and Its Applications, ICCSA 2018, held in Melbourne, Australia, in July 2018.
Apart from the general tracks, ICCSA 2018 also includes 34 international workshops in various areas of computational science, ranging from computational science technologies to specific areas such as computer graphics and virtual reality.
The total of 265 full papers and 10 short papers presented in the five-volume proceedings set of ICCSA 2018 were carefully reviewed and selected from 892 submissions.
The paper Nitrogen Gas on Graphene: Pairwise Interaction Potentials is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.

Table of Contents

Frontmatter

Workshop Cities, Technologies and Planning (CTP 2018)

Frontmatter
Spatial Data Analysis and Evaluation by Urban Planners of a PPGIS Experiment Performed in Porto Alegre, Brazil

We analyze spatial data on public perception collected in a Public Participation Geographic Information System (PPGIS) experiment performed in Porto Alegre, Brazil. Aiming to verify the capability of PPGIS to access inhabitants' local knowledge and facilitate its incorporation into urban planning, we use an exploratory analysis in the form of heat maps to describe the spatial distribution of our variables and identify patterns. In addition, PPGIS was evaluated by urban planners in order to assess the experts' openness to the method. The results indicate that, by collecting public perceptions in an automated, geo-referenced manner, PPGIS enables these data to be analyzed together with other information layers necessary for urban planning. Moreover, in the opinion of the urban planners consulted, local knowledge collected with PPGIS can help improve the content quality of urban plans and/or projects. However, it is necessary to expand technical knowledge so that the data collected can be analyzed and incorporated into urban planning in a consistent way.

Geisa Bugs
A GIS-Based Method to Assess the Pedestrian Accessibility to the Railway Stations

Modern cities, affected by congestion, atmospheric and acoustic pollution, and land consumption, should change and return to a human scale, rather than being designed for cars. Proper land-use planning, infrastructure improvement and the implementation of targeted interventions have to bring citizens closer to public transport in order to reduce these problems. An important aspect of modal choice by users of the public transport system is, however, linked to the quality of pedestrian routes: every journey made by public transport begins and ends on foot. Therefore, studying pedestrian accessibility to transit stations and the walkability of the pedestrian environment is important in order to understand how they influence users' choices. In this paper, a GIS-based method for assessing the quality of pedestrian paths and accessibility to stations is proposed. Measuring the quality of road segments is useful for redesigning spaces, planning interventions on pedestrian routes and setting intervention priorities, thereby developing a decision support system. Different indicators, linked to the pleasantness of the route, its practicability and its safety, each with a different influence on the user's choice, have been chosen to characterize the quality of pedestrian routes. As a case study, some stations of the Palermo railway link (Belgio, Francia and San Lorenzo) have been chosen, in order to identify solutions to be proposed to the municipal administration.

Gabriele D’Orso, Marco Migliore
Urban Regeneration for a Sustainable and Resilient City: An Experimentation in Matera

In Italian and European urban policies, the urban regeneration of residential districts, especially suburbs built during post-war urban expansion, has been a crucial issue since the 1990s. An integrated approach is now essential for effective urban redevelopment programs, one that considers not only architectural and urban aspects, but also social, economic, naturalistic-environmental and cultural ones, in order to return dignity, identity and centrality to the marginal areas that today express tangible and intangible forms of urban unease. The development in recent decades of innovative tools such as complex programs and social housing therefore marks the transition to a new way of urban planning, characterized by a different approach to urban and territorial policies, aimed at integrating a plurality of functions and typologies of intervention and contemplating the possibility of involving private operators and private financial resources in the realization of public works. The paper presents a case study as a best-practice approach to urban regeneration.

Piergiuseppe Pontrandolfi, Benedetto Manganelli
Internal Areas and Smart Tourism. Promoting Territories in Sardinia Island

The paper tackles the promotion of the internal areas of Sardinia, given the contrast between strong, peak coastal tourism and lower development in the internal areas. Considering the vast amount of cultural and environmental goods present in the territory, particularly in internal areas, we propose a mixed qualitative-quantitative method for highlighting the most promising areas for targeting ad hoc development policies. We start with a hot spot analysis of cultural and environmental goods in the Sardinia Region, with particular reference to the internal areas. We then highlight the municipalities, or aggregations of municipalities, presenting the highest concentrations of such goods. Next, we qualitatively evaluate those events or sites that are generally more renowned and appreciated, observing whether local, municipal or regional policies have been activated for their promotion. On this basis, we suggest possible interventions to enhance the touristic offer of this part of the island of Sardinia.

Silvia Battino, Ginevra Balletto, Giuseppe Borruso, Carlo Donato
Gentrification and Sport. Football Stadiums and Changes in the Urban Rent

In this paper we examine the changes in terms of urban rent and urban planning occurring after the introduction of Italian Law no. 96 of 21 June 2017 on football stadium property and management. This law is paving the way for a set of new and still unexplored consequences for urban rents, urban renewal processes and real estate markets, as well as for new patterns of urban behavior. In detail, the changes concern both the times strictly related to sport events, which are well scheduled (peak events), and those related to the ordinary life of the area (off-peak events), such as retail, transport and leisure/residential activities, now often coupled with the presence of such sport facilities. We briefly analyze a few Italian cases of football stadium renewal, especially those located in cities hosting premier league clubs, and look at their consequences in terms of urban rent and urban services. We then consider the possible implications that such investments can have on cities likely to host such renewal processes in the near future, trying to highlight some possible changes in the "hedonic price" asset, and offer suggestions in terms of policy aimed at igniting a 'good' gentrification process.

Ginevra Balletto, Giuseppe Borruso, Francesco Tajani, Carmelo M. Torre
ALARP Approach for Risk Assessment of Civil Engineering Projects

Risk assessment is essential to express judgments of economic convenience on investment initiatives. This certainly applies to civil engineering projects, where the risk components are not only economic, but also environmental, social and cultural. Thus, the aim of the paper is to delineate a risk analysis model for the economic evaluation of investments through the development of algorithms in which the Cost-Benefit Analysis (CBA) logic is integrated with the ALARP principle. The latter provides operative tools ensuring that risk is tolerable if it is "As Low As Reasonably Practicable". The study shows that the ALARP logic, widely applied in sectors such as nuclear, energy and oil & gas, but less implemented in civil engineering, can become an important investigative tool if used jointly with the CBA precisely in the economic evaluation of civil projects, contributing to the characterization of efficient forecast protocols. In the first section of the paper, the steps necessary to manage the risk connected to a project initiative are described and the ALARP logic is analysed. The second section presents the risk analysis approaches traditionally used in the economic evaluation of projects. In the third section, the logical scheme of an innovative protocol for the management of project risk is defined by integrating the ALARP principles into the procedural scheme of the CBA. In conclusion, prospects for future research are outlined.

Gianluigi De Mare, Antonio Nesticò, Renato Benintendi, Gabriella Maselli
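The ALARP principle partitions risk into three regions: unacceptable, tolerable only if reduced "as low as reasonably practicable", and broadly acceptable. A minimal sketch of that classification, combined with a CBA-style net present value that deducts an expected annual risk cost, might look like the following. The threshold values, function names and the risk-cost treatment are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of ALARP-style risk classification; thresholds
# are illustrative placeholders, not values from the paper.
UNACCEPTABLE_THRESHOLD = 1e-3        # annual risk above this is intolerable
BROADLY_ACCEPTABLE_THRESHOLD = 1e-6  # annual risk below this is acceptable

def alarp_region(annual_risk: float) -> str:
    """Classify a risk level into the three ALARP regions."""
    if annual_risk >= UNACCEPTABLE_THRESHOLD:
        return "unacceptable"
    if annual_risk <= BROADLY_ACCEPTABLE_THRESHOLD:
        return "broadly acceptable"
    return "ALARP"  # tolerable only if reduced as low as reasonably practicable

def risk_adjusted_npv(cash_flows, discount_rate, expected_annual_loss):
    """CBA-style NPV with an expected risk cost deducted each period
    (one simple way to fold risk into the CBA logic)."""
    return sum((cf - expected_annual_loss) / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows))
```

A project whose residual risk falls in the middle band would then require a demonstration that further reduction is disproportionately costly before being accepted.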
Smart Block EAN: Ten Scalable Initiatives for a Smart City

Accelerated urbanization is a global problem, and smart cities are at the center of the debate on sustainability and urban development around the world. Transforming current cities into smart cities is a great challenge, since it requires, among other things, transcendental changes in public administration and in the social conscience of citizens. We propose a scaled-down solution in which we apply the smart city concept to a single city block. This article presents the conception of the Smart Block EAN project and the development of its ten phases. Finally, we show the progress of the ten projects that make up Smart Block EAN and provide conclusions.

Alix E. Rojas, Camilo Mejía-Moncayo, Leonardo Rodríguez-Urrego
Spatial Indicators to Evaluate Urban Fragmentation in Basilicata Region

The increase of artificial land use is a relevant indicator in land management policies and practices. It is a useful tool in assessing the quality of settlement processes and of protection and enhancement policies in rural and natural areas. Over time, land take processes have been caused by different phenomena: urban or industrial expansion, the realization of infrastructures, and the development or productive exploitation of territorial areas characterized by the presence of specific resources (natural, mining, etc.). Throughout the Italian national territory, this phenomenon is no longer a direct consequence of a real need for new expansion areas. In the past, the phenomenon was mainly due to residential, productive or tertiary-sector needs, generated by demographic growth and the consequent urbanization process. In the last two decades, land take has been increasingly related to weak territorial governance, generally linked to the inefficiency of urban and territorial planning instruments and sometimes to speculative real estate initiatives. In this paper, a spatial analysis procedure oriented to the calculation of urban fragmentation indicators for the Basilicata Region is presented. Such indicators can lead to the identification of two phenomena, urban sprawl and urban sprinkling, according to the literature classification proposed in several studies by Romano et al. The results represent a useful contribution to improving the regional normative system concerning urban development. The research is part of a wider project on environmental and territorial indicators (INDICARE) promoted by FARBAS (Environmental Observatory Foundation of Basilicata Region) in collaboration with the University of Basilicata.

Lucia Saganeiti, Angela Pilogallo, Francesco Scorza, Giovanni Mussuto, Beniamino Murgante
Increasing the Walkability Level Through a Participation Process

The paper analyses the theme of walkability in the western part of the Potenza municipality. It is based on a participatory process developed within the Active Citizenship for Sustainable Development of the Territory (CAST) project. During this experience, a cognitive framework was defined by adopting traditional approaches and, in order to increase participation, by using new information technologies and social networks. The data that emerged were revised and evaluated for the definition of possible strategies for improving walkability, accessibility to services and equipment and, more generally, neighbourhood liveability.

Raffaella Carbone, Lucia Saganeiti, Francesco Scorza, Beniamino Murgante

Workshop Defense Technology and Security (DTS 2018)

Frontmatter
VM-CFI: Control-Flow Integrity for Virtual Machine Kernel Using Intel PT

Nowadays, cloud computing technology is used for a variety of services, such as the Internet of Things and artificial intelligence. However, as more data is processed in the cloud, there is growing concern about security issues in the cloud computing environment. To address this concern, many studies have been conducted to ensure the integrity of virtual machines in a cloud computing environment. However, in the case of control-flow integrity for the virtual machine, existing approaches not only require modification of the kernel code, but also cannot protect it efficiently. In this paper, we propose VM-CFI, which efficiently protects the control-flow integrity of the VM kernel without modification of the VM kernel in a cloud computing environment. For this purpose, VM-CFI utilizes Processor Trace (PT), a hardware feature recently supported by the Intel architecture. According to the experimental results, VM-CFI incurs on average 4.2% overhead.

Donghyun Kwon, Jiwon Seo, Sehyun Baek, Giyeol Kim, Sunwoo Ahn, Yunheung Paek
Study on Classification of Defense Scientific and Technical Information in Korea

The Korean government enacted the Defense Industrial Technology Security Act in 2015 and announced the list of defense industrial technologies to be protected for national security. Accordingly, defense industry companies should establish a defense technology protection system and protect the defense industrial technology information designated by law. Currently, the Korean government is trying to develop a classification system for defense scientific and technical information. In this paper, we study such a classification system. To do this, we analyze the directives currently operated by the Ministry of National Defense of Korea and the classification system for scientific and technical information of the US Department of Defense. We then propose a classification system that is based on the Korean system and adopts elements of the US system to supplement its shortcomings.

Ara Hur, Yeonseung Ryu
A FPGA-Based Scheme for Protecting Weapon System Software Technology

Recently, the Korean government has established a law to protect defense industrial technology for the sake of national security and is trying to apply the anti-tamper methodology to weapon system acquisition programs. Anti-tampering refers to the systems engineering activity of protecting critical information in systems from tampering and reverse engineering. An adversary's malicious tampering can weaken our military advantage, shorten the expected combat life of our systems, and erode our defense industrial technological competitiveness. In this paper, we introduce an overview of anti-tampering techniques and suggest a software protection technique using FPGAs which can be efficiently used for weapon systems.

Minhae Jang, Yeonseung Ryu, Hyunkyoo Park
Conceptualization of Defense Industrial Security in Relation to Protecting Defense Technologies

In order to protect the advancement of defense technology, which has a tremendous effect on both national security and the economy, the Republic of Korea established the Defense Technology Security Act in 2015. As the new enactment brought changes to the landscape of the defense industry and defense industrial security, a new examination of the concept of defense industrial security became necessary. Even taking into consideration the undisclosed nature of defense industrial security research, and the fact that only a limited number of firms participate in the field, scientific research on the topic has not been active. However, with the new enactment of the Defense Technology Security Act, it is necessary to expand the scope of security and to redefine the concept of defense industrial security. In this paper, we analyze research on related technology protection policies and the environment of our defense industry in order to conceptualize defense industrial security. It is expected that the established concepts could provide a systematic way to protect confidential information and defense technology.

Heungsoon Park, Heejae Go, Jonghyeon Hwang
Secure Cluster-Wise Time Synchronization in IEEE802.15.4e Networks

In IEEE802.15.4e networks, all nodes need to maintain high-precision synchronization, which enables highly reliable and low-power wireless networking. Cluster-wise time synchronization, which provides a common time among a cluster of nodes, is usually necessary in IEEE802.15.4e networks; however, it cannot survive malicious attacks in hostile environments. We propose a secure cluster-wise time synchronization for IEEE802.15.4e networks. This paper makes three contributions. First, we provide an in-depth security analysis of cluster-wise time synchronization in IEEE802.15.4e networks. Second, we propose a security countermeasure which includes an improved μTESLA broadcast authentication protocol and a fault-tolerant time synchronization algorithm. The improved μTESLA broadcast authentication protocol adopts a packet-based key chain mechanism to resolve the conflict between the delay of disclosed keys and the length of the key chain in the original μTESLA. The fault-tolerant time synchronization algorithm adopts a cluster-wise time synchronization model to guarantee an upper bound on the time difference between normal nodes in a cluster. Finally, theoretical analysis and real experiment results validate the effectiveness and feasibility of the security countermeasure.

Wei Yang, Zhixiang Lai, JuanJuan Zheng, Yugen Yi, Yuanlong Cao
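μTESLA-style broadcast authentication rests on a one-way hash key chain: keys are generated backwards by repeated hashing and disclosed in the opposite order, so a receiver holding the chain's commitment can authenticate any later disclosed key, while an attacker who sees a disclosed key cannot derive future ones. A minimal sketch of the core idea (not the paper's packet-based variant) might look like this:

```python
import hashlib

def generate_key_chain(seed: bytes, length: int) -> list:
    """Build a one-way hash chain; keys are used in reverse order of
    generation, so a disclosed key cannot be used to forge future keys.
    chain[0] is the public commitment, chain[-1] the secret seed."""
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    chain.reverse()
    return chain

def verify_disclosed_key(disclosed: bytes, commitment: bytes,
                         max_steps: int) -> bool:
    """A receiver holding the commitment authenticates a disclosed key
    by hashing it forward until it matches (or gives up)."""
    k = disclosed
    for _ in range(max_steps):
        k = hashlib.sha256(k).digest()
        if k == commitment:
            return True
    return False
```

The conflict the paper addresses follows from this structure: disclosing keys too slowly delays authentication, while a long interval between disclosures forces receivers to hash many steps or store a longer chain.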
Defense Technology Security Education Status

Since the Defense Technology Security Act came into effect on June 30, 2016, target organizations holding defense industry technology must establish an organization and build the technical security system necessary for the protection and management of defense technology, and must educate their employees regarding the law and how to protect their defense industry technology. Defense technology security education is conducted to raise awareness of defense technology security and to nurture defense technology security experts. It allows employees of the defense industry to share the importance of defense technology and to feel the need for security. However, the training of defense technology security experts is still in its infancy and needs improvement. This paper emphasizes the necessity of new security education by identifying the problems of current defense technology security education and analyzing similar education programs. It also suggests directions for improving defense technology security education through an on-line education system and an advanced curriculum.

Hyoung-ryoul Cho
Multi-lateral Cybersecurity Cooperation for Military Forces in the Digital Transformation Era

Most organizations, including armed forces, have continuously invested in security solutions such as security information and event management, next-generation firewalls and intrusion prevention systems to improve their cybersecurity capabilities. Even though most legacy monitoring tools can detect cyber threats, warfighting and business information systems remain vulnerable to modified or unknown threats, since these evolve continuously in a covert manner. At the Seoul Defense Dialogue (SDD) Cyber Working Group (CWG) 2017, we discussed several agendas related to civil-military cooperative efforts that can improve cybersecurity capabilities with high-risk/high-payoff research in the digital transformation era. The essence is granting cyber workforces access to the appropriate data from the various sources needed to assure mission completeness, based on current technological achievements. In various situations, military operations must be synchronized with kinetic forces through appropriate cyber information. For this purpose, cyber situational awareness (SA) provides risk mitigation and resilience capabilities for mission completeness. To improve SA capabilities, it is important to share data using big data and machine learning technology through civil-military and international cooperation, especially for defensive cyber operations. In this paper, we describe the lessons learned from the SDD CWG on improving the state of cybersecurity and the cognitive technology for improved cyber situational awareness.

Hyun Kyoo Park, Wootack Lee, Zinkyung Ha, Namhee Park
A Study on the Weapon System Software Reliability Prediction and Estimation Process at the Software Development Phase

The proportion of software in weapon systems is gradually increasing, and embedded software is becoming an important factor in determining a weapon system's performance. The critical software defects in weapon systems are those that are not found during the development phase. To minimize the cost of defect correction, systematic software quality control is required from the development phase onward. To increase the reliability of the software, reliability prediction and reliability estimation should be performed during software development. This paper suggests a software reliability process and presents a comparison of the software quality metrics of a project that applied the reliability prediction and estimation models. It will contribute to improving software quality by minimizing potential defects in the software development phase.

In Soo Ryu, Suk Jae Jeong
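Reliability estimation models of the kind the abstract refers to typically fit a reliability growth curve to defect data collected during testing. The paper does not specify which model it applies, so as an illustrative example only, here is the widely used Goel-Okumoto model, where the expected cumulative number of defects found by testing time t is mu(t) = a(1 - e^(-bt)), with a the total expected defect count and b the detection rate:

```python
import math

def goel_okumoto(t: float, a: float, b: float) -> float:
    """Expected cumulative defects found by time t under the
    Goel-Okumoto software reliability growth model:
    mu(t) = a * (1 - exp(-b * t)).
    Illustrative only; not necessarily the model used in the paper."""
    return a * (1.0 - math.exp(-b * t))

def remaining_defects(t: float, a: float, b: float) -> float:
    """Defects expected to remain latent after testing time t,
    i.e. the potential field defects the paper aims to minimize."""
    return a - goel_okumoto(t, a, b)
```

Fitting a and b to observed defect counts lets a project estimate how many defects remain before release, which is the quantity a quality-control process would track.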
Decentralized Message Broker Federation Architecture with Multiple DHT Rings for High Survivability

A message broker is an essential component of messaging services: it connects information providers and consumers (message service clients) to form an integrated message network. To support large-scale message clients at the contracted service level, multiple brokers often have to collaborate and form a federated broker cluster. For example, Kafka is one of the most popular messaging systems that allows multiple brokers to work together; ZooKeeper, the distributed cluster coordinator, manages cluster nodes and stores message metadata for Kafka. Even though ZooKeeper does the coordination job well in a simple way, its semi-centralized coordination makes the overall system less survivable. In some domains, such as military warfare and embedded sensor networks, we may lose the primary coordinator or more than half of the coordinator machines; in these cases, we cannot maintain even minimum survivability of the message network. To address this limited survivability problem, we propose a decentralized message broker federation architecture based on a distributed hash table (DHT). In our proposed design, the decentralized coordinator supports DHT exchanges between brokers to manage the metadata of distributed message partitions. We built a prototype message broker federation based on our decentralized metadata coordinator design to show its feasibility in practical applications.

Minsub Kim, Minho Bae, Sangho Yeo, Gyudong Park, Sangyoon Oh
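The core idea of DHT-based coordination is that metadata keys are mapped onto brokers by hashing alone, so no single coordinator owns the assignment. A minimal consistent-hashing sketch (a simplification; the paper's multi-ring design is more elaborate) shows how any broker can independently compute which peer owns a given metadata key:

```python
import hashlib
from bisect import bisect_right

class HashRing:
    """Minimal consistent-hashing ring: each broker owns the arc of the
    key space preceding its hash point, so key placement needs no
    central coordinator. A simplified sketch, not the paper's design."""

    def __init__(self, brokers):
        self._ring = sorted((self._hash(b), b) for b in brokers)

    @staticmethod
    def _hash(key: str) -> int:
        # 64-bit position on the ring derived from SHA-1
        return int.from_bytes(hashlib.sha1(key.encode()).digest()[:8], "big")

    def broker_for(self, key: str) -> str:
        """Return the broker responsible for `key`: the first broker
        clockwise from the key's hash point (wrapping around)."""
        points = [h for h, _ in self._ring]
        idx = bisect_right(points, self._hash(key)) % len(self._ring)
        return self._ring[idx][1]
```

Because every node evaluates the same pure function of the key, losing any subset of brokers only requires re-placing the keys they owned, rather than electing a new central coordinator.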
Method to Judge the Coupling State of the Resonator in the Q-Factor Measurement

Judging the coupling state of the cavity is key to Q-factor measurement in microwave devices and communication systems. A simple, real-time method to determine the coupling state of resonators is proposed. The coupling states are determined by the graphical relationship between the resonance circles and the match point on the Smith chart. The extraction of the coupling coefficient and the unloaded quality factor can then be performed directly and promptly. The procedure is validated with experimental measurements of manufactured cavities. The proposed method provides an effective way to solve the parameter extraction problem for resonators.

Hai Zhang
An Efficient Transmission Approach for Information-Centric Based Wireless Body Area Networks

Wireless body area networks (WBANs) are a promising network infrastructure for ubiquitous personal health management services. In WBANs, biological sensors are attached to the human body to obtain health-related information, and the communication gateway sends the sensed data to the health data server periodically. Health service providers can use the data to provide specific treatment for each individual according to their personal condition. Despite many advantages, the TCP/IP protocols used by WBANs incur efficiency issues in data transmission. In TCP/IP, the data and their storage location are coupled, which greatly affects the data transmission performance of WBANs. Moreover, an additional security layer is required to support data privacy and security in WBANs. In this paper, an Efficient Transmission Approach for Information-Centric based Wireless Body Area Networks (ETA-ICWBAN) is proposed. The proposed ETA-ICWBAN classifies the data into different priorities (high, normal, low), and the communication gateway buffers the data in the designated queue. We design a hybrid queue management algorithm to balance the transmission of data of different priorities under various network conditions. Simulation results show that the proposed approach is able to improve the data transmission performance of WBANs.

Longzhe Han, Yi Bu, Xin Song, Jia Zhao
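The three-priority buffering described above can be illustrated with a toy gateway that keeps one queue per priority class and always serves the highest non-empty class first. This is a strict-priority sketch only; the paper's hybrid queue management algorithm additionally balances classes against network conditions, which is not modeled here.

```python
from collections import deque

class PriorityGateway:
    """Toy sketch of three-priority buffering at a WBAN gateway.
    Strict priority only; the paper's hybrid algorithm is more elaborate."""

    PRIORITIES = ("high", "normal", "low")

    def __init__(self):
        # one FIFO queue per priority class
        self.queues = {p: deque() for p in self.PRIORITIES}

    def enqueue(self, packet, priority="normal"):
        """Buffer a sensed-data packet in its designated queue."""
        self.queues[priority].append(packet)

    def dequeue(self):
        """Transmit next: drain higher-priority queues first."""
        for p in self.PRIORITIES:
            if self.queues[p]:
                return self.queues[p].popleft()
        return None  # nothing buffered
```

Strict priority alone can starve low-priority traffic under sustained high-priority load, which is presumably why a hybrid scheme is needed.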
Efficiencies in Binary Elliptic Curves

This paper discusses the choices of elliptic curve models available to the would-be implementer, and assists the decision as to which model to use by examining the links between security and efficiency. In early public key cryptography schemes, such as ElGamal and RSA, the use of finite fields over large prime numbers was prevalent, thus preventing the need for difficult and expensive computations over extension fields. Thus, with the introduction of elliptic curve models, the same computational infrastructure using prime fields was inevitably used. As it became clear that elliptic curve models were more efficient than their public key competitors, they acquired a great deal of attention. In more recent times, and with the onset of the Internet of Things, the cryptography community is faced with the challenge of improving the efficiency of cryptography even further, resulting in many papers dealing with improvements of computational efficiencies. This search, along with improvements in both software and hardware dealing with characteristic two fields has instigated the analysis of elliptic curve constructions over binary extension fields. In particular, the ability to identify an object in the field with a bit string aids computation for binary elliptic curves. These circumstances account for our focus on binary elliptic curve fields in this paper in which we present an in-depth discussion on their efficiency and security properties along with other relevant features of various binary elliptic curve models.

Scott T. E. Hirschfeld, Lynn M. Batten, Mohammed K. I. Amain
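The efficiency argument for binary fields rests on the fact that field elements are bit strings and field arithmetic reduces to shifts and XORs. A short sketch of multiplication in GF(2^m), shift-and-add with reduction by an irreducible polynomial, illustrates this; the small-field parameters in the test are illustrative, while B163_MOD is the standard NIST B-163 reduction polynomial.

```python
def gf2m_mul(a: int, b: int, modulus: int, m: int) -> int:
    """Multiply two elements of GF(2^m), each represented as an m-bit
    string packed into a Python int. Classic shift-and-add: addition in
    characteristic two is XOR, and overflow past bit m is reduced by
    the irreducible polynomial `modulus` (which includes the x^m term)."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # "add" the current shifted copy of a
        b >>= 1
        a <<= 1                  # multiply a by x
        if (a >> m) & 1:
            a ^= modulus         # reduce: clears bit m
    return result

# x^163 + x^7 + x^6 + x^3 + 1, the NIST B-163 reduction polynomial
B163_MOD = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1
```

The entire loop uses only shifts and XORs, which is exactly the "object in the field identified with a bit string" advantage the paper highlights; hardware carry-less multiply instructions accelerate the same computation further.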

Workshop Geomatics for Resource Monitoring and Control (GRMC 2018)

Frontmatter
A Low Cost Methodology for Multispectral Image Classification

Multispectral and hyperspectral remote sensing have significantly improved territorial surveys and mapping. However, aerial images are often expensive, being acquired through aircraft and satellite sensors. Furthermore, the processing and classification of these images require commercial software, which increases the overall cost of the analysis. For these reasons, we propose an approach to data acquisition and analysis based on supervised classification to obtain accurate maps of the area of interest in reduced time. The images were acquired with a 3-channel Tetracam ADC-Lite camera and processed with free and open-source software, PixelWrench2 and QGIS. The results obtained demonstrate that the approach can compete with traditional acquisition and classification methods, thanks to simple operational procedures, low operational costs, and the high accuracy of supervised classification. This approach provides promising results that encourage the development and optimization of these technologies for other purposes, such as the mapping of asbestos-cement (AC) roof coverings.

Michele Mangiameli, Giuseppe Mussumeci, Alessio Candiano
Low-Altitude UAV-Borne Remote Sensing in Dunes Environment: Shoreline Monitoring and Coastal Resilience

UAV systems, fitted with either active or passive surveying sensors, can provide land-related measures and quantitative information with low costs and high resolution in both space and time. Such surveying systems can be quite valuable in defining geometrical and descriptive parameters in coastal systems, especially dune ecosystems. The present work is based on a survey of the dune system at the mouth of the Fiume Morto Nuovo in the San Rossore Estate (Pisa) and focuses on comparing LiDAR with UAV- and airplane-borne photogrammetry, as well as the respective 2D and 3D cartographic output, in order to assess topography changes along a stretch of coastline and to check their possible use in defining some ecological resilience features on coastal dune systems. Processing of survey data generates a Digital Surface Model (DSM) or Digital Terrain Model (DTM) and an orthophotograph, checked for accuracy and image resolution. Comparison of these products against those available in public access cartographical databases highlights differences and respective strengths.

Gabriella Caroti, Andrea Piemonte, Yari Pieracci
Calibration of CLAIR Model by Means of Sentinel-2 LAI Data for Analysing Wheat Crops Through Landsat-8 Surface Reflectance Data

This study proposes a method to calibrate the semi-empirical CLAIR model, a simplified reflectance model used to estimate the Leaf Area Index (LAI) from optical data, using Landsat-8 Operational Land Imager Surface Reflectance (OLISR) data over wheat cultivation areas. The procedure can be applied in the absence of both LAI field measurements and surface reflectance (SR) data by exploiting free-of-charge data, namely the novel high-level Landsat-8 OLISR and Sentinel-2 LAI (S2 LAI) products. The latter dataset was used as the LAI reference at field scale. Once calibrated, the model generates LAI information from OLISR data consistent with the S2 LAI. In this way it is possible to merge the two products to obtain a finer temporal resolution for LAI estimation throughout the crop seasons. The method was tested and statistically assessed on three different wheat test fields located in the Capitanata area (Apulia region, Italy).

Giuseppe Peschechera, Umberto Fratino
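The CLAIR model in its standard form (Clevers) inverts the Weighted Difference Vegetation Index into LAI via LAI = -(1/alpha) * ln(1 - WDVI/WDVI_inf), where alpha is the crop-specific extinction coefficient and WDVI_inf the asymptotic WDVI at full cover; calibration amounts to fitting these two parameters. A minimal sketch, with placeholder parameter values rather than the paper's calibrated ones:

```python
import math

def wdvi(nir: float, red: float, soil_slope: float) -> float:
    """Weighted Difference Vegetation Index: WDVI = NIR - slope * RED,
    where `soil_slope` is the slope of the bare-soil line."""
    return nir - soil_slope * red

def clair_lai(wdvi_value: float, wdvi_inf: float, alpha: float) -> float:
    """Standard semi-empirical CLAIR inversion:
    LAI = -(1/alpha) * ln(1 - WDVI / WDVI_inf).
    `alpha` and `wdvi_inf` must come from calibration; the values used
    in any call here are illustrative, not the paper's."""
    if not 0 <= wdvi_value < wdvi_inf:
        raise ValueError("WDVI must lie in [0, WDVI_inf)")
    return -(1.0 / alpha) * math.log(1.0 - wdvi_value / wdvi_inf)
```

In the paper's setting, the S2 LAI product would supply the reference LAI values against which alpha and WDVI_inf are fitted, after which the formula can be applied to OLISR-derived WDVI.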
Multi-image 3D Reconstruction: A Photogrammetric and Structure from Motion Comparative Analysis

Virtual Web reconstruction of cultural heritage is one of the most interesting and innovative tools for preserving the historical, architectural and artistic memory of many sites, particularly those prone to disappear, as well as for promoting territories and tourism development. Recently, high-resolution 3D models have been realized through improved technology and the integration of survey techniques such as laser scanning and photogrammetry. However, the large and complex volume of 3D data makes it difficult for both experts and citizens to access and handle them. In particular, the rendering of large 3D models may affect the performance of web publication and browsing. Against this background, the goal of this paper is to compare the level of accuracy and realism of 3D models optimized using two different mesh simplification methods. The metric used is based on the Hausdorff distance, a generic technique for defining a distance between two nonempty sets, taking the 3D scanner mesh as the reference in the measurement. The "Casale di Pacciano" near Bisceglie (Apulia region, Italy) has been investigated as a case study.
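The Hausdorff distance used as the comparison metric can be illustrated with a minimal sketch (an illustrative implementation over small 3D point sets, not the authors' code): it takes the larger of the two directed distances between the sets.

```python
def hausdorff(A, B):
    """Symmetric Hausdorff distance between two finite 3D point sets."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    def directed(X, Y):
        # for each point of X, find its nearest neighbour in Y; keep the worst case
        return max(min(dist(x, y) for y in Y) for x in X)

    return max(directed(A, B), directed(B, A))
```

In mesh comparison tools the same idea is applied to (sampled) mesh surfaces rather than raw vertex lists.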

Grazia Caradonna, Eufemia Tarantino, Marco Scaioni, Benedetto Figorito
Investigation of a Flood Event Occurred on Lama Balice, in the Context of Hazard Map Evaluation in Karstic-Ephemeral Streams

In the context of flood risk assessment and urban territory protection, the proposed research focuses on the definition of flood hazard maps using high-resolution Digital Terrain Models (DTMs) obtained by the Light Detection And Ranging (LiDAR) remote sensing technique. The hydrologic/hydraulic model was calibrated on a flood event that occurred in June 2014 on Lama Balice, an ephemeral stream located in Puglia (Southern Italy), using the water levels observed during a field campaign. In particular, the analysis was performed for the definition of hazard maps with return periods of 30, 200 and 500 years, exploiting a combined one-/two-dimensional flood propagation scheme for the delineation of flooded areas. The research makes a significant contribution to the assessment of techniques for dynamic hazard and risk evaluation, in order to support institutions (such as Basin Authorities and Civil Protection agencies) and professionals in the application of recent European legislation on flood risk protection (Floods Directive 2007/60/EC) and in European scientific research programs (such as Horizon 2020) in ungauged karstic catchments.

Vito Iacobellis, Audrey M. N. Martellotta, Andrea Gioia, Davide Prato, Vincenzo Totaro, Rocco Bonelli, Gabriella Balacco, Alisa A. M. G. Esposito
Flood Susceptibility Evaluation on Ephemeral Streams of Southern Italy: A Case Study of Lama Balice

In the proposed work, areas exposed to flood risk were evaluated in the particular context of karst ephemeral streams located in the Puglia region (Southern Italy). The case study of Lama Balice, characterized by a natural geomorphologic structure, was used to test a DTM-based approach aimed at the rapid identification and mapping of flood risk. The inundated areas, obtained with a 2D hydraulic model following design rainfall events with different return periods, were used as reference maps for the selection of the most appropriate geomorphological descriptor by means of a binary classifier test. The performance of the adopted procedure was assessed by validating the selected geomorphological descriptors on an area different from that used for calibration, in order to estimate the discrepancy between DTM-based flood maps and those obtained by numerical simulation.

Andrea Gioia, Vincenzo Totaro, Rocco Bonelli, Alisa A. M. G. Esposito, Gabriella Balacco, Vito Iacobellis
Static and Kinematic Surveys Using GNSS Multi-constellation Receivers and GPS, GLONASS and Galileo Data

In this paper, the results of static and kinematic surveys using GNSS multi-constellation receivers acquiring GPS, GLONASS and Galileo Open Service (OS) data are presented. The static and kinematic GNSS data processing was performed using the open source program package RTKLIB. The kinematic surveys were compared with a reference trajectory computed from a Mobile Mapping System (MMS) using a high-performance GNSS/INS system. As reference stations for the static and kinematic surveys, a multi-constellation Leica receiver belonging to the ItalPos network and located at the University of Trieste, Italy, and a station belonging to the Friuli Venezia Giulia Region Marussi GNSS network were used. The obtained results show, even at this initial phase, good planimetric performance in both static and kinematic applications. Two static sessions performed six months apart allowed the authors to analyze the solution improvements due to the increased number of Galileo satellites.

Raffaela Cefalo, Antonio Novelli, Tatiana Sluga, Paolo Snider, Eufemia Tarantino, Agostino Tommasi
Geometric Accuracy Evaluation of Geospatial Data Using Low-Cost Sensors on Small UAVs

The recent development and proliferation of Unmanned Aircraft Systems (UASs) has made it possible to examine environmental processes and changes occurring at spatial and temporal scales that would be difficult or impossible to detect using conventional remote sensing platforms. However, new methodologies need to be codified in order to be comparable with traditional photogrammetric products. This can be done by testing the geometrical accuracies reached by the models when the external orientations are changed. In this paper, two dense point clouds derived from the same spatial database were compared to evaluate the discrepancies resulting from two different relative orientations: the first based on the GPS position of each UAV frame and the second based on GCPs measured through GNSS positioning. The two dense point clouds presented an average offset of 3.4 cm and a standard deviation of 5 cm, indicating that relative accuracy is influenced only by the matching intensity. To assign three different absolute orientations, the georeferencing procedure of the same orthomosaic was then verified based on GCPs coming from three different open geo-data sources. By evaluating the position discrepancies of a set of Independent Check Points (ICPs), the three open geo-data sources yielded three different estimates of root-mean-square error (RMSE) positional accuracy, from which the absolute geometrical precision was derived.
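The RMSE positional accuracy evaluated on the Independent Check Points can be sketched as follows (an illustrative computation over hypothetical planimetric coordinates, not the authors' processing pipeline):

```python
import math

def planimetric_rmse(measured, reference):
    """RMSE of (E, N) check-point coordinates against their reference values."""
    squared = [
        (e1 - e2) ** 2 + (n1 - n2) ** 2
        for (e1, n1), (e2, n2) in zip(measured, reference)
    ]
    return math.sqrt(sum(squared) / len(squared))
```

Applying this per geo-data source gives one RMSE value per absolute orientation, which is how the three sources can be ranked.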

Mirko Saponaro, Eufemia Tarantino, Umberto Fratino
Fire Risk Estimation at Different Scales of Observations: An Overview of Satellite Based Methods

Since the mid-1980s, satellite remote sensing data have been used in forest fire monitoring for applications related to the diverse phases of fire management, such as fire prevention, danger estimation, detection of active fires, estimation of fire effects (burned area mapping, fire severity estimation, smoke plumes, biomass losses, etc.), post-fire recovery and fire regime characterization. Today, satellite technologies can fruitfully support both research and operational activities for fire monitoring and management at different temporal and spatial scales and with cost-effective tools. This paper provides a short overview of satellite remote sensing for forest fire danger estimation at different scales of observation.

Rosa Lasaponara, Angelo Aromando, Gianfranco Cardettini, Monica Proto

8th International Symposium on Software Quality (ISSQ 2018)

Frontmatter
Developer Focus: Lack of Impact on Maintainability

We looked for evidence of a connection between source code quality erosion and developer focus. We assumed that more focused developers, i.e. those concerned with a well specified part of the source code at a time, are likely to commit higher quality code than those who are less focused, i.e. committing to various parts of the code. We estimated code quality with the ColumbusQM quality model and developer focus with structural scattering. Although the assumption sounds quite logical, we could not find any supporting evidence. As structural scattering assigns a measure to a set of source files/classes (i.e., how close they are to each other in the package hierarchy), we could apply it in various ways. First, we defined developer focus as the structural scattering of the set of source files in a commit, to validate whether more focused changes have a better impact on maintainability than less focused ones. Second, we calculated the structural scattering of all the files the developer of a commit had modified in the last 3 months and assigned this measure as the developer focus of that commit. With this test we checked whether more focused developers tend to commit better quality code than less focused ones. We also performed this test for every developer separately, considering only the subset of commits created by that particular developer. We calculated the level of developer focus and the maintainability change for every commit of three open-source and one proprietary software system. With the help of the Wilcoxon rank test we compared the focus values of commits causing a maintainability increase with those causing a decrease. The results are inconclusive and do not even tend in the same direction; we therefore found no evidence of a connection between maintainability and developer focus. This is thus a publication of negative results.
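The Wilcoxon rank test used here compares two samples through their pooled ranks. A minimal sketch of the underlying rank-sum statistic, with average ranks assigned to ties, is shown below (illustrative only; the study itself would have used a statistics package that also computes the p-value):

```python
def rank_sum(sample_a, sample_b):
    """Wilcoxon rank-sum statistic: sum of the ranks of sample_a in the pooled data."""
    pooled = sorted(sample_a + sample_b)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        # tied values share the average of ranks i+1 .. j
        ranks[pooled[i]] = (i + 1 + j) / 2
        i = j
    return sum(ranks[v] for v in sample_a)
```

If the two groups of focus values come from the same distribution, the statistic stays close to its expected value; large deviations indicate a systematic difference.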

Csaba Faragó, Péter Hegedűs
Software Reliability Assessment Using Machine Learning Technique

Software reliability is one of the major attributes in a software quality assurance system. A large number of research works have attempted to improve the reliability of software. Research directions in improving software reliability can be defined as a three-step process: software modeling, software measurement and software improvement. Each of these phases is equally important in obtaining a reliable software system, and better accuracy in estimating reliability is important for managing software quality. A number of metrics have been proposed in the literature for evaluating the reliability of software. Machine learning approaches have been found suitable for evaluating different parameters of software reliability, and several machine learning techniques have evolved to capture the different characteristics of a software system. Machine learning algorithms such as naive Bayes, support vector regression, decision tree and random forest have proven successful in classifying bug data from datasets whose features are dependent on each other. In this paper, a deep learning approach is proposed to estimate the reliability of software. The proposed approach uses a recurrent neural network for predicting the number of bugs or failures in software. The effectiveness of deep learning is extensively compared with the standard machine learning algorithms on a dataset collected from the literature.

Ranjan Kumar Behera, Suyash Shukla, Santanu Kumar Rath, Sanjay Misra
Quantitative Quality Assessment of Open Source Software by Considering New Features and Feature Improvements

Open Source Software (OSS) evolves through the active participation of users requesting features, i.e. new features (NFs) and improvements in existing features (IMPs). Fixing these features results in the generation of further feature improvements. In this paper, we propose a mathematical model to embody OSS development based on the rate at which IMPs are generated as a result of fixing features (NFs and IMPs). We have validated the model on datasets of five products of the Apache open source project, namely Avro, Pig, Hive, jUDDI and Whirr. Results show that the model exhibits significant goodness of fit in terms of the MSE (Mean Square Error), Bias, Variation, RMSPE (Root Mean Square Prediction Error) and R2 performance measures.

Kamlesh Kumar Raghuvanshi, Meera Sharma, Abhishek Tandon, V. B. Singh
A Framework for Quality Measurement of BPMN Process Models

Nowadays, organizations are collaborative and process-intensive. Measurement of process models has a variety of applications, including quality evaluation of process models, process improvement and task planning. The objective of this work is to propose a framework for quantifying and linking different quality characteristics of process models built with a standard process modeling language: BPMN. The framework supports the derivation of measures regarding internal quality characteristics of process models, to enable measurement of the quality perceived by process models' users at the end of the modeling process. To this aim, a measurement terminology for process modeling was set up to support the instantiation of process model measures. This was done through the specification of a set of activities to be carried out in order to derive the base measures and indirect measures for predicting process model quality. This work follows a product-oriented approach, by which quality is assumed to be a multi-dimensional concept with several interrelated characteristics.

Anacleto Correia, António Gonçalves, Mário Simões-Marques
Feature Level Complexity and Coupling Analysis in 4GL Systems

Product metrics are widely used in the maintenance and evolution phase of software development to advise the development team about software quality. Although most of these metrics are defined for mainstream languages, several of them have been adapted to fourth generation languages (4GL) as well. Usual concepts like size, complexity and coupling need to be re-interpreted and adapted to the program elements defined by these languages. In this paper we take a further step in this process to address product line development in 4GL. Adopting a product line architecture is a necessary step to handle the challenges of a growing number of similar product variants. The product line adoption process itself is a tedious task in which the features of the product variants play a crucial role. Features represent a higher level of abstraction that is cross-cutting with respect to the program elements of 4GL applications. We propose a set of feature-related metrics by linking existing program elements to metrics and by relating features to each other. The focus of this study is on complexity and coupling metrics. We provide a metrics-based analysis of several variants of a large-scale industrial product line written in the Magic XPA 4GL language.

András Kicsi, Viktor Csuvik, László Vidács, Árpád Beszédes, Tibor Gyimóthy
A Case Study on Measuring the Size of Microservices

In cloud computing, microservices have become the most widely used architectural style. However, there is still an ongoing debate about how big a microservice should be. In this case study, a monolithic application is measured using Common Software Measurement International Consortium (COSMIC) Function Points. The same application is then divided into pieces following Domain Driven Design (DDD) principles. The resulting cloud-friendly microservices are measured again using COSMIC Function Points and the results are compared.
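COSMIC Function Points size a functional process by counting its data movements, with each Entry, Exit, Read and Write contributing one CFP. A toy tally over hypothetical movement lists (illustrative of the counting rule only, not the study's measurement procedure):

```python
# The four COSMIC data movement types; each identified movement counts as 1 CFP.
COSMIC_TYPES = {"Entry", "Exit", "Read", "Write"}

def cosmic_cfp(movements):
    """Sum COSMIC Function Points for one functional process."""
    for m in movements:
        if m not in COSMIC_TYPES:
            raise ValueError(f"not a COSMIC data movement: {m}")
    return len(movements)
```

Measuring the monolith and each extracted microservice with the same rule is what makes the before/after comparison meaningful.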

Hulya Vural, Murat Koyuncu, Sanjay Misra
A Hands-on OpenStack Code Refactoring Experience Report

Nowadays, almost everyone uses some kind of cloud infrastructure. As clouds gain more and more attention, it is ever more important to have stable and reliable cloud systems. Along with stability and reliability comes source code maintainability. Unfortunately, maintainability has no exact definition; there are several definitions from both users' and developers' perspectives. In this paper, we analyzed two projects of OpenStack, the world's leading open-source cloud system, using QualityGate, a static software analyzer that can help determine the maintainability of software. During the analysis we found quality issues that could be fixed by refactoring the code, and we created 47 patches in these two OpenStack projects. We also analyzed our patches with QualityGate to see whether they increase the maintainability of the system. We found that a single refactoring has a barely noticeable effect on the maintainability of the software; it can even decrease maintainability. But if refactorings are performed regularly, their cumulative effect will probably increase quality in the mid and long term. We also found that our refactoring commits were much appreciated by the open-source community.

Gábor Antal, Alex Szarka, Péter Hegedűs
Teaching Database Design and Analysis in an Effective Way on Digital Platform and Its Effect on Society

We live in the era of digital learning, in which learners from different cultures and diverse backgrounds ought to be addressed. The quality of teaching that the learner receives matters most, and course contents need to keep pace with the changing nature of organizations, especially in the IT discipline. The objective of this paper is to teach databases for IT organizations using a digital learning platform, keeping in mind that IT organizations are agile and lean. The paper discusses how the learning platform helps an organization gear up its employees with the skills needed to handle DBMS design and analysis to meet the demands of the IT sector.

Uma Maheswari Sadasivam, Chamundeswari Arumugam
Study of Various Classifiers for Identification and Classification of Non-functional Requirements

Identification of non-functional requirements in an early phase of the software development process is crucial for creating a proper software design. These requirements are often neglected or given in too general a form. However, interviews and other sources of requirements often include important references to non-functional requirements embedded in a larger textual context. The non-functional requirements have to be extracted from these contexts and presented in a formulated, standardized way to support software design. The set of requirements extracted from their textual context has to be classified in order to formalize them. This task can be accomplished manually, but it is very demanding and error-prone. Several attempts have been made to support the identification and classification tasks using supervised and semi-supervised learning processes, and these efforts have achieved remarkable results. Researchers have mainly focused on classification performance measured by precision and recall. However, for a tool that supports business analysts in their requirements elicitation tasks, execution time is also an important factor which has to be taken into account. Knowing the performance and benchmark results can help business analysts choose a proper method for their classification tasks. The study presented in this article therefore compares both the performance of the classification processes and their execution time, to support the choice among the methods.

László Tóth, László Vidács

Workshop Smart Factory Convergence (SFC 2018)

Frontmatter
A Hybrid Rule-Based and Fuzzy Logic Model to Diagnostic Financial Area for MSMEs

The importance of micro, small and medium enterprises (MSMEs) in the regional and global economy has led the academic community to be concerned with their development and growth, which is why several public and private institutions have carried out studies on the problems of these organizations. We present a rule-based diagnostic model that works together with fuzzy logic. Our model makes it possible to determine the possible "diseases" these enterprises suffer in financial matters, to assess their severity, and to offer alternative solutions.

Germán Méndez Giraldo, Eduyn López-Santana, Carlos Franco
A Novel Cloud-Fog Computing Network Architecture for Big-Data Applications in Smart Factory Environments

Research on cloud systems is increasing among the various ways of dealing with the data generated by smart factories. In this paper, we propose a cloud-based fog computing network architecture that can be used to efficiently store, analyze, and utilize the data accumulated in smart factories. We build and evaluate system modeling and a testbed based on the proposed architecture. As a result, the cloud developed with OpenStack can be used for smart factory operation, analyzing various types of data and monitoring the status of real-time processing through a dashboard, making it a highly suitable structure for smart factory environments that require such processing.

Dae Jun Ahn, Jongpil Jeong, Sunpyo Lee

Workshop Theoretical and Computational Chemistry and Its Applications (TCCA 2018)

Frontmatter
The ECTN Virtual Education Community Prosumer Model for Promoting and Assessing Chemical Knowledge

The dynamism of learning economies is examined in order to single out the key factors that promote knowledge dissemination and invention development. The various steps involved in the production and usage of both tacit and explicit technological knowledge as a common good are analysed in order to optimize its portability. The role played in this respect by business clusters dealing with knowledge (especially when adopting the prosumer model) is discussed with particular reference to chemical education in Higher Education Institutions. The adoption of the prosumer model for building a European system aimed at promoting and assessing chemical knowledge is examined. The particular case considered in the paper is the one born out of the activities of the universities that are members of the European Chemistry Thematic Network association, through its Virtual Education Community Committee and with the operational support of Master-UP s.r.l., a former spinoff of the University of Perugia. Results achieved during the first year of activity are discussed.

Antonio Laganà, Osvaldo Gervasi, Sergio Tasso, Damiano Perri, Francesco Franciosa
A Circular Economy Proposal on CO2 Reuse to Produce Methane Using Energy from Renewable Sources

The case of a cluster of companies and public institutions producing innovation in the field of storing energy obtained from renewable sources and investing in new circular production models is investigated. The role played by externalities in making the production cycle itself a source of social welfare by minimizing deadweight losses is examined in detail in light of the Klepper and Nordhaus models and by performing a microeconomic analysis. The study singles out the importance of strategically positioning industrial innovation in the stream of the circular economy. As a specific case study, the efforts made by Master UP srl in trying to drive this segment of the energy circular economy to success, by developing an efficient and fruitful business for carbon-neutral methane production, are analyzed in terms of production isoquant curves.

Antonio Laganà, Lorenzo di Giorgio

Open Access

Nitrogen Gas on Graphene: Pairwise Interaction Potentials

We investigate different types of potential parameters for the graphene-nitrogen interaction. Interaction energies calculated at the DFT level are fitted with the semi-empirical Improved Lennard-Jones potential. Both a pseudo-atom potential and a full atomistic potential are considered. Furthermore, we consider the influence of the electrostatic part on the parameters, using different charge schemes found in the literature as well as optimizing the charges ourselves. We have obtained parameters for both the nitrogen dimer and the graphene-nitrogen system. For the former, the four-charge Cracknell scheme reproduces with high precision the CCSD(T) interaction energy as well as the experimental diffusion coefficient in both the pseudo-atom and full atomistic potentials. For the latter, the atom-atom model provides an average interaction energy of 2.3 kcal/mol, comparable with the experimental graphene-$$\text {N}_{\text {2}}$$ interaction of 2.4 kcal/mol.
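The Improved Lennard-Jones potential used in the fit is commonly written, following Pirani and co-workers, with a distance-dependent repulsive exponent n(r) = beta + 4(r/r0)^2. A sketch of this functional form (parameter values below are placeholders, not the fitted graphene-nitrogen parameters):

```python
def ilj(r, epsilon, r0, beta=8.0, m=6):
    """Improved Lennard-Jones potential (Pirani form) for neutral-neutral pairs.

    epsilon: well depth; r0: equilibrium distance; by construction V(r0) = -epsilon.
    """
    x = r / r0
    n = beta + 4.0 * x * x  # distance-dependent repulsion exponent
    return epsilon * (m / (n - m) * x ** (-n) - n / (n - m) * x ** (-m))
```

At x = r/r0 = 1 both power terms equal 1, so the expression reduces to -epsilon regardless of beta, which is a convenient sanity check on any implementation.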

Jelle Vekeman, Noelia Faginas-Lago, Inmaculada G. Cuesta, José Sánchez-Marín, Alfredo Sánchez De Merás
Confinement of the Pentanitrogen Cation Inside Carbon Nanotubes

In recent years, the field of polynitrogen chemistry has seen a sparking activity, with new outstanding theoretical and experimental results. Polynitrogen clusters are excellent candidates for high-energy density materials, but their intrinsic instability poses great challenges for the synthesis and the subsequent storage. In this work, we explore by means of quantum chemical calculations the confinement of the pentanitrogen cation, N$$_5^+$$, inside carbon nanotubes of different diameters. The interaction of the two fragments is such that a charge transfer from the nanotube to the nitrogen cation occurs and leads to the subsequent decomposition of N$$_5^+$$, thus resulting in an overall unstable system. Nonetheless, preliminary results on the confinement of the neutral N$$_8$$ chain (as the product of an N$$_5^+$$ + N$$_3^-$$ addition) are presented, where it is shown that the encapsulation decreases the overall energy of the complex system. Two stable N$$_8$$ isomers are discussed and a first investigation on possible decomposition pathways is carried out.

Stefano Battaglia, Stefano Evangelisti, Thierry Leininger, Noelia Faginas-Lago
Potential Energy Surface for the Interaction of Helium with the Chiral Molecule Propylene Oxide

The discovery of propylene oxide in the interstellar medium has raised considerable interest in this molecule, which represents one of the simplest cases of a chiral system. In this paper, we present a quantum chemical study and a phenomenological approach, through the Pirani potential function, of the He - propylene oxide system in fourteen different configurations. A comparison of the optimized molecular structures at various levels of theory, as well as a discussion of the two approaches, is reported. The analytical form of the Pirani potential function permits future applications in classical simulations of molecular-beam collision experiments, especially those related to chirality discrimination phenomena, which are in progress in our laboratory.

Patricia R. P. Barreto, Alessandra F. Albernaz, Vincenzo Aquilanti, Noelia Faginas-Lago, Gaia Grossi, Andrea Lombardi, Federico Palazzetti, Fernando Pirani
First-Principles Molecular Dynamics and Computed Rate Constants for the Series of OH-HX Reactions (X = H or the Halogens): Non-Arrhenius Kinetics, Stereodynamics and Quantum Tunnel

This paper is part of a series aiming at elucidating the mechanisms involved in the non-Arrhenius behavior of the four-body OH + HX (X = H, F, Cl, Br and I) reactions. These reactions are very important in atmospheric chemistry, and they are also of basic relevance for chemical kinetics. Their kinetics has manifested non-Arrhenius behavior: the experimental rate constants for the OH + HCl and OH + H2 reactions, when extended to low temperatures, show a concave curvature in the Arrhenius plot, a phenomenon designated as sub-Arrhenius behavior, while the reactions with HBr and HI are considered typical processes exhibiting a negative temperature dependence of the rate constants (anti-Arrhenius behavior). From a theoretical point of view, these reactions have been studied in order to obtain the potential energy surface and to reproduce these complex rate constants using Transition State Theory. Here, in order to understand the non-Arrhenius mechanism, we exploit recent information from ab initio molecular dynamics. For OH + HI and OH + HBr, the visualization of bond rearrangements along trajectories has shown how molecular reorientation occurs so that the reactants meet at a mutual angle of approach favorable to reaction. Besides demonstrating the crucial role of stereodynamics, additional documentation was also provided on the interesting manifestation of the roaming phenomenon, both in the search for reactive configurations sterically favorable to reaction and in the subsequent departure of products involving their vibrational excitation. Under a moderate tunneling regime, the OH + H2 reaction was satisfactorily described by deformed-Transition-State Theory. In the same reaction, the catalytic effect of water can be assessed by path integral molecular dynamics.
For the OH + HCl reaction, the theoretical rate coefficients calculated with the Bell tunneling correction were in good agreement with experimental data over the entire temperature range 200-2000 K, with minimal effort compared to much more elaborate treatments. Furthermore, the Born-Oppenheimer molecular dynamics simulation showed that the orientation process was less effective than for the HBr and HI reactions, emphasizing the role of quantum tunneling through the energy barrier in the reaction path along the potential energy surface. These results can shed light on the different non-Arrhenius mechanisms involved in four-body reactions, providing rate constants and their temperature dependence of relevance for pure and applied chemical kinetics.
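The deformed-Transition-State Theory rate law invoked for OH + H2 is commonly written, in the Aquilanti-Mundim formulation (notation assumed here, not taken from this abstract), as:

```latex
k(T) = A \left[ 1 - d\,\frac{E^{\ddagger}}{RT} \right]^{1/d},
\qquad
\lim_{d \to 0} k(T) = A\, e^{-E^{\ddagger}/RT}
```

In the limit d -> 0 the ordinary Arrhenius law is recovered; negative values of the deformation parameter d produce the concave (sub-Arrhenius) curvature associated with quantum tunneling, while positive values describe the anti-Arrhenius trend.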

Nayara D. Coutinho, Vincenzo Aquilanti, Flávio O. Sanches-Neto, Eduardo C. Vaz, Valter H. Carvalho-Silva

Workshop Parallel and Distributed Data Mining (WPDM 2018)

Frontmatter
Parallel Mining of Correlated Heavy Hitters

We present a message-passing based parallel algorithm for mining Correlated Heavy Hitters from a two-dimensional data stream. To the best of our knowledge, this is the first parallel algorithm solving the problem. We show, through experimental results, that our algorithm provides very good scalability, whilst retaining the accuracy of its sequential counterpart.
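For background, the classic sequential summary for plain (single-dimension) heavy hitters can be sketched as a Misra-Gries counter table; the paper's algorithm parallelizes a harder, two-dimensional correlated variant of this problem (the sketch below is illustrative only and is not the paper's algorithm):

```python
def misra_gries(stream, k):
    """Misra-Gries summary with at most k-1 counters.

    Any item occurring more than len(stream)/k times is guaranteed
    to survive in the returned counter table.
    """
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # no free counter: decrement everything, evicting zeros
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters
```

The counts are lower-bound estimates; a second pass over the stream (or a merge of per-process summaries, in the parallel setting) is typically used to confirm the candidates.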

Marco Pulimeno, Italo Epicoco, Massimo Cafaro, Catiuscia Melle, Giovanni Aloisio
An Innovative Framework for Supporting Frequent Pattern Mining Problems in IoT Environments

In the current era of big data, high volumes of a wide variety of data of differing veracity can easily be generated or collected at high velocity from rich data sources, including devices from the Internet of Things (IoT). Embedded in these big data are useful information and valuable knowledge. Hence, frequent pattern mining and the related research problem of association rule mining, which aim to discover implicit, previously unknown and potentially useful information and knowledge (in the form of sets of frequently co-occurring items, or rules revealing relationships between these frequent sets) from big data, have drawn the attention of many researchers. Since the introduction of these research problems, numerous information system and engineering approaches have been developed, including serial algorithms, distributed and parallel algorithms, and MapReduce-based big data mining algorithms. These algorithms can be run on local computers, in distributed and parallel environments, and on clusters, grids and clouds. In this paper, we describe some of these algorithms and discuss how to mine frequent patterns and association rules in fogs, i.e., at the edges of the computing network.
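At its simplest, frequent pattern mining counts co-occurring itemsets against a support threshold. A toy sketch for item pairs (illustrative of the core counting step only, not one of the surveyed algorithms):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Return item pairs co-occurring in at least min_support transactions."""
    counts = Counter()
    for t in transactions:
        # deduplicate and sort so each pair has one canonical form
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] += 1
    return {pair: c for pair, c in counts.items() if c >= min_support}
```

Real algorithms (Apriori, FP-growth and their distributed counterparts) avoid enumerating all candidate itemsets, which is exactly where the parallel and fog-based designs discussed in the paper come in.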

Peter Braun, Alfredo Cuzzocrea, Carson K. Leung, Adam G. M. Pazdor, Syed K. Tanbeer, Giorgio Mario Grasso
An Innovative Architecture for Supporting Cyber-Physical Security Systems

Physical and cyber security is an important topic to be managed in complex communication systems. Specialized equipment and devices help human operators manage security issues. The security manager of such a system must be capable of correctly decoding all the incoming inputs from perceiving subsystems, which can be of different types and must be aggregated and interpreted in the actual environment in order to generate alarms and countermeasures. This paper presents the design of a Decision and Control Unit (DCU) for such systems. The DCU aims at helping the security manager of the system take informed decisions in real time, reducing his/her workload. The general requirements, the architecture and the design of this decision support and decision making system are presented. An example of alarm detection is finally outlined.

Alfredo Cuzzocrea, Massimiliano Nolich, Walter Ukovich

Workshop Sustainability Performance Assessment: Models, Approaches and Applications Toward Interdisciplinary and Integrated Solutions (SPA 2018)

Frontmatter
Integrated SDSS for Environmental Risk Analysis in Sustainable Coastal Area Planning

The work deals with the development and implementation of a Spatial Decision Support System (SDSS) platform for coastal environmental risk analysis, integrating multisource satellite data (Sentinel-1, Sentinel-2 and COSMO-SkyMed) with an open-source coastal hydrodynamic model addressing flooding, erosion and pollution. The processing results allow us to track longshore pollutant dynamics connected to bathing use, derive shoreline changes, map back-dune vegetation and detect rocky coast movements, as well as to capture coastal area changes through advanced image segmentation techniques, multi-band change detection and Persistent Scatterer Interferometric Synthetic Aperture Radar (PSInSAR) technologies. The SDSS provides cyclical production and updating of the coastal scenarios for flooding risk analysis, in phase with the satellite data acquisition frequency. These capabilities enable operative products to be employed in the knowledge chain supporting sustainable coastal area planning. Moreover, self-consistent applicative tools with a graphical interface, developed in IDL and integrated in the SDSS, allow the display and automatic extraction of coastline sequences from Sentinel-1 data. The comparison of two or more shorelines, even from different sources, then yields the computation of coastal erosion and aggradation as well as of the areas prone to coastal flooding. Finally, interoperable tools for morpho-hydrodynamic model assimilation have been developed and implemented to reproduce flooding and pollution risk scenarios, as well as coastal resilience assessments, for different return periods.
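The erosion/aggradation computation from two extracted shorelines can be sketched as follows; this is a hypothetical illustration, not the authors' IDL tooling, and it assumes each shoreline is represented as cross-shore distances (metres seaward of a fixed baseline) sampled at regular alongshore intervals:

```python
def shoreline_change(old, new, spacing):
    """Estimate eroded and aggraded areas between two shoreline surveys.

    `old` and `new` are cross-shore positions (metres seaward of a baseline)
    sampled every `spacing` metres alongshore. A positive difference (the new
    shoreline lies seaward of the old one) counts as aggradation, a negative
    one as erosion. Segments where the sign changes are approximated by the
    trapezoid's net area. Illustrative sketch only.
    """
    eroded = aggraded = 0.0
    for (o1, n1), (o2, n2) in zip(zip(old, new), zip(old[1:], new[1:])):
        # Trapezoidal strip between the two shorelines on this segment.
        d1, d2 = n1 - o1, n2 - o2
        area = (d1 + d2) / 2 * spacing
        if area >= 0:
            aggraded += area
        else:
            eroded += -area
    return eroded, aggraded

# Hypothetical example: a 200 m stretch retreating 2 m over the first segment.
eroded_m2, aggraded_m2 = shoreline_change([10, 10, 10], [8, 8, 12], spacing=100)
```

In practice the shorelines extracted from Sentinel-1 scenes would first be resampled onto a common set of baseline transects before such a comparison.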

Michele Greco, Giovanni Martino, Annibale Guariglia, Lucia Trivigno, Vito Sansanelli, Angela Losurdo, Giovanni Mussuto
A Review of Residential Water Consumption Determinants

Water supply sectors are facing growing uncertainty in both resource availability and consumer demand. Future conservation programs require a full understanding of the underlying factors of residential water consumption. However, previous studies have only considered one or a few groups of factors without putting them all together in a bigger picture. This study provides a comprehensive view of these determinants and their relationships, and discusses current gaps and possible directions. Determinants are categorized into six groups: (1) economic; (2) socio-demographic; (3) physical property; (4) technological; (5) climatic; and (6) spatial drivers. Together these determinants form a very complex picture with many possible interrelationships. This, on the one hand, poses challenges in selecting suitable techniques to avoid autocorrelation but, on the other hand, offers chances to substitute unavailable important data with proxy variables. We emphasize the lack of regional and cultural diversity in current studies, as most of them were carried out in developed and arid areas. Hence, a wider range of country-specific and local studies is needed to better reflect the determinants and their relationships in diverse contexts. Future studies should also consider a broader assessment scope, taking into account effects such as feedback loops, spillover and rebound, and must address modern issues such as balancing smart monitoring device utilization against consumer privacy.

Nguyen Bich-Ngoc, Jacques Teller
Carbon Stock as an Indicator for the Estimation of Anthropic Pressure on Territorial Components

Since the beginning of the industrial era, humans have been modifying the chemical composition and physical properties of the atmosphere, driving the concentrations of gases such as carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O) to levels never previously reached [1]. Even where uncertainty or, in some cases, scepticism remains about the real extent of environmental or climate change, the increase in the concentration of these gases shows that humans are in fact heavily changing the environment [2]. Growing concern in the scientific community about phenomena linked to climate modification and to land use changes, to which they are often due or at least related, has created the need to strengthen the available information and to develop methodologies capable of providing an adequate framework to support policies for territorial planning and land use transformation, with a holistic view of the services and functions that are indispensable and/or desirable for human wellbeing. It is in this context that this work aims to estimate the amount of carbon stored within the boundaries of the Basilicata region, no longer treating it as a quantity estimated for its own sake, but as the assessment of a service provided by ecosystems for the regulation of the global climate, a perspective that has gained increasing strength over the last 50 years [3].

Arianna Mazzariello, Angela Pilogallo, Francesco Scorza, Beniamino Murgante, Giuseppe Las Casas
Tourism Attractiveness: Main Components for a Spacial Appraisal of Major Destinations According with Ecosystem Services Approach

It is widely recognized that the tourism sector plays an important role in territorial analysis, both because of its economic and employment potential and because of its environmental and social implications. Investigating the dynamics that influence tourist attractiveness and receptivity is fundamental in territorial planning in order to monitor regional and sustainable development policies, above all with a view to pursuing the 17 Sustainable Development Goals (SDGs) defined by the 2030 Agenda for Sustainable Development [1]. Within the methodological framework established by the Millennium Ecosystem Assessment [2], the evaluation of the degree of tourism specialization was carried out following the ecosystem services approach, relying on InVEST, a spatially explicit model developed to provide a decision support system able to quantify the set of goods and services provided by ecosystems and profitable for human well-being. The study area, located in Southern Italy and including the territories of the Basilicata and Puglia regions, was chosen because in recent years it has been characterized by a steadily growing flow of arrivals and overnight stays, to which a significant impetus was given by the designation of Matera as European Capital of Culture for 2019. The work is completed by a contextualization of the results within the framework of tourist attractiveness and accessibility of the examined area, which shows strong internal differences: Basilicata is a region with poor infrastructure that has only recently emerged in the tourism market, while Puglia is considered one of the most sought-after Italian destinations.

Angela Pilogallo, Lucia Saganeiti, Francesco Scorza, Giuseppe Las Casas
Using Open Data and Open Tools in Defining Strategies for the Enhancement of Basilicata Region

Open data availability, participation and knowledge sharing are becoming increasingly important in planning processes aimed at protecting and enhancing the territory. This paper presents an application of Volunteered Geographic Information (VGI) for the creation of an open database for the enhancement of the Basilicata region. The work was carried out during the Smart Basilicata training project and led to the definition of a map of the services of the regional territory, built from open-source tools and data available online and processed through geographic information systems.

Raffaella Carbone, Giovanni Fortunato, Giovanna Pace, Emanuele Pastore, Luciana Pietragalla, Lydia Postiglione, Francesco Scorza
From the UN New Urban Agenda to the Local Experiences of Urban Development: The Case of Potenza

The references that the United Nations New Urban Agenda (UN HABITAT, 2015, 2016 and 2017) draws to the attention of the disciplinary debate could represent a strong innovation for urban governance instruments and procedures, especially in this period when European cities are defining the operational framework of the 2014–2020 development strategies within the complex system of programming and management of European resources. The tools that the New Cohesion Policy proposes for the definition of local urban agendas are based on a procedural innovation that makes cities protagonists in a complex process of planning and managing resources (huge but never sufficient) for the regeneration and development of cities. In this work we propose an evaluation of the ITI of the Municipality of Potenza, approved in 2017, as it represents an approach that looks at strategic visions of urban development born in a context of fragile local government and financial constraints on municipal finances. This situation is representative of a large number of medium-sized cities that are called upon to perform the functions of 'Managing Authority' in the management of Regional Operational Programme resources, without a structured process of technical/administrative 'empowerment' defining skills, functions and sustainability scenarios within an urban planning attentive to three principles: equity, efficiency and conservation of resources. In the proposed case study, a framework of isolated interventions emerges, implementing parts of a strategy consistent with the guidelines of the urban planning instrument and requiring an integrated vision of the main priority intervention areas of the local context (mobility, urban parks, historical-cultural resources, services to citizens).

Giuseppe Las Casas, Francesco Scorza
The Role of Intermediate Territories for New Sustainable Planning and Governance Approaches. Criteria and Requirements for Determining Multi-municipal Dimension: South Italy Case

In a context like the Italian one, in which the fragmentation of municipalities, hyper-territorialization and the inadequacy of the current administrative network cause ineffective and inefficient public policies, this paper investigates the close relationship between institutional and economic-territorial policies through a comprehensive re-reading of the organization of local authorities in Italy. On the basis of a set of reading and interpretation criteria, it aims to identify the territorial morphologies best suited to ensuring more advanced and effective forms of representation and government. The goal is to experiment with a possible methodology for reading territories that responds to the need to adapt the territorial structure of local authorities to the new challenges of modernity and economic-productive innovation, and to the rescaling induced by globalization. The work contributes to the process, still in progress, of defining the rules and principles according to which municipalities should join in functional areas or networks able to govern territories and promote conditions of greater sustainability in local development processes (1).

Piergiuseppe Pontrandolfi, Antonella Cartolano
Investigating Good Practices for Low Carbon Development Perspectives in Basilicata

The Good Practices selected by the Province of Potenza within the LOCARBO project, which belong to the project's Three Thematic Pillars described below, fully testify to the experimentation of new cooperation practices in low-density contexts, where results are positive when virtuous collaborations have been activated, linking energy policy objectives with citizen awareness, sustainable energy themes and new local entrepreneurial actions able to provide positive externalities in economic terms and through the exploitation of local resources. This work re-classifies the Good Practices presented in the LOCARBO project according to criteria constructed by looking at the peculiar characteristics of the territorial context of reference. Low-density contexts, implementation of integrated projects, inter-institutional cooperation models, collaborative approaches between institutions and communities, and the valorization of community skills and local resources are the new criteria, built after reading the local peculiarities through a review of projects and development strategies in energy efficiency programs.

Alessandro Attolico, Rosalia Smaldone, Francesco Scorza, Emanuela De Marco, Angela Pilogallo
Backmatter
Metadata
Title
Computational Science and Its Applications – ICCSA 2018
edited by
Prof. Dr. Osvaldo Gervasi
Beniamino Murgante
Sanjay Misra
Elena Stankova
Prof. Dr. Carmelo M. Torre
Ana Maria A.C. Rocha
Prof. David Taniar
Bernady O. Apduhan
Prof. Eufemia Tarantino
Prof. Yeonseung Ryu
Copyright year
2018
Electronic ISBN
978-3-319-95174-4
Print ISBN
978-3-319-95173-7
DOI
https://doi.org/10.1007/978-3-319-95174-4