
About this Book

This book gathers various perspectives on modern map production. Its primary focus is on the new paradigm of “sharing and reuse,” which is based on decentralized, service-oriented access to spatial data sources.

Service-Oriented Mapping is one of the main paradigms used to embed big data and distributed sources in modern map production, without the need to own the sources. To be stable and reliable, this architecture requires specific frameworks, tools and procedures. In addition to the technological structures, organizational aspects and geographic information system (GIS) capabilities provide powerful tools to make modern geoinformation management successful.

Addressing a range of aspects, including the implementation of the semantic web in geoinformatics, using big data for geospatial visualization, standardization initiatives, and the European spatial data infrastructure, the book offers a comprehensive introduction to decentralized map production.




Exploring a New Paradigm in Map Production


Chapter 1. Changing Paradigm in Map Production and Geoinformation Management—An Introduction

Maps influence our daily life, either as an interface that we consult or in the form of precise geospatial information working in the background of everyday devices such as the smartphone or the car. Access to maps is evolving into a basic right, realized for example through volunteered geographic information. The concepts of processing geospatial data, creating geospatial information and transmitting geospatial knowledge make use of service-oriented architectures. The paradigm seems to be changing from a “collecting—assembling—storing” to a “reusing—assembling—sharing” methodology. This chapter explores the new paradigm in map production and geoinformation management, highlights its relevance and discusses the most important requirements that need to be considered.
Markus Jobst, Georg Gartner

Chapter 2. Service-Oriented Processing and Analysis of Massive Point Clouds in Geoinformation Management

Today, landscapes, cities, and infrastructure networks are commonly captured at regular intervals using LiDAR or image-based remote sensing technologies. The resulting point clouds, representing digital snapshots of reality, are used for a growing number of applications, such as urban development, environmental monitoring, and disaster management. Multi-temporal point clouds, i.e., 4D point clouds, result from scanning the same site at different points in time and open up new ways to automate common geoinformation management workflows, e.g., updating and maintaining existing geodata such as models of terrain, infrastructure, buildings, and vegetation. However, existing GIS are often limited by processing strategies and storage capabilities that generally do not scale for massive point clouds containing several terabytes of data. We demonstrate and discuss techniques to manage, process, analyze, and provide large-scale, distributed 4D point clouds. All techniques have been implemented in a system that follows service-oriented design principles, thus maximizing its interoperability and allowing for a seamless integration into existing workflows and systems. A modular service-oriented processing pipeline is presented that uses out-of-core and GPU-based processing approaches to efficiently handle massive 4D point clouds and to reduce processing times significantly. With respect to the provision of analysis results, we present web-based visualization techniques that apply real-time rendering algorithms and suitable interaction metaphors. Hence, users can explore, inspect, and analyze arbitrarily large and dense point clouds. The approach is evaluated based on several real-world applications and datasets featuring different densities and characteristics. Results show that it enables the management, processing, analysis, and distribution of massive 4D point clouds as required by a growing number of applications and systems.
Sören Discher, Rico Richter, Matthias Trapp, Jürgen Döllner
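The out-of-core strategy mentioned above can be illustrated with a minimal sketch: points are streamed in fixed-size chunks and folded into a running aggregate, so memory use stays bounded regardless of dataset size. The chunk size and the bounding-box reduction below are illustrative assumptions, not the chapter's actual pipeline.

```python
# Minimal sketch of an out-of-core processing step for massive point clouds:
# only one chunk of points is resident in memory at any time.
from typing import Iterable, Iterator, List, Tuple

Point = Tuple[float, float, float]

def chunked(points: Iterable[Point], size: int) -> Iterator[List[Point]]:
    """Yield successive chunks so at most `size` points are in memory at once."""
    chunk: List[Point] = []
    for p in points:
        chunk.append(p)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

def bounding_box(points: Iterable[Point], chunk_size: int = 100_000):
    """Fold per-chunk reductions into a global axis-aligned bounding box."""
    lo = [float("inf")] * 3
    hi = [float("-inf")] * 3
    for chunk in chunked(points, chunk_size):
        for p in chunk:
            for i in range(3):
                lo[i] = min(lo[i], p[i])
                hi[i] = max(hi[i], p[i])
    return tuple(lo), tuple(hi)
```

The same fold pattern applies to any per-chunk reduction (density histograms, change detection statistics), which is what makes it attractive for datasets that exceed main memory.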

Chapter 3. Establishing Common Ground Through INSPIRE: The Legally-Driven European Spatial Data Infrastructure

Back in the 1990s, there were several barriers to accessing and using the spatial data and information necessary for environmental management and policy making in Europe. These included different data policies, encodings, formats and semantics, to name a few. Data was collected for, and applied to, domain-specific use cases, and comprehensive standards did not exist, all impacting the re-usability of such public sector data. To release the potential of spatial data held by public authorities and improve evidence-based environmental policy making, action was needed at all levels (local, regional, national, European) to introduce more effective data and information management and to make data available in citizens’ interest. The INSPIRE Directive, the Infrastructure for Spatial Information in Europe, directly addresses this set of problems. The Directive came into force on 15 May 2007, with full implementation in every EU Member State required by 2021. It combines a legal and a technical framework for the EU Member States to make relevant spatial data accessible and reusable. Specifically, this has meant making data discoverable and interoperable through a common set of standards, data models and Internet services. The Directive’s data scope covers 34 themes of cross-sector relevance as a decentralised infrastructure where data remains at the place it can be best maintained. A great deal of experience has been gained by public administrations through its implementation. Due to its complexity and wide scope, this is taking place in a stepwise manner, with benefits already emerging as important deadlines approach. Efficient and effective coordination follows the participatory approach established in its design. It is timely to reflect on 10 years of progress of the “cultural change” which the European Spatial Data Infrastructure represents.
We therefore consider the lessons INSPIRE offers for those interested in joined-up and federated approaches to geospatial data-sharing and semantic interoperability across borders and sectors. The approach itself is evolving through this experience.
Vlado Cetl, Robert Tomas, Alexander Kotsev, Vanda Nunes de Lima, Robin S. Smith, Markus Jobst

Chapter 4. Evaluating the Efficiency of Various Styles of Distributed Geoprocessing Chains for Visualising 3D Context Aware Wildfire Scenes from Streaming Data

Big data refers to the ever-increasing volumes of data being generated continuously by a large variety of sensors, such as smartphones and satellites. In this chapter, we explore solutions for challenges related to the velocity characteristic of big geospatial data. The research was inspired by the Advanced Fire Information System (AFIS), which provides near real-time fire detection from earth observation satellites. Users in Southern and East Africa, South America and Europe are automatically notified of active fire detections, with the hope that timeous information may help to mitigate the impact of wildfires. This chapter evaluates the efficiency of various styles of geoprocessing chains for generating enriched notifications containing 3D fire visualisations from an intermittent stream of active fire detection data generated by remote sensing satellites. The visualisation should be ready for viewing as soon as the user investigates the notification; this implies the requirement for rapid geoprocessing, since there may be hundreds of fire detections disseminated to hundreds of parties at any satellite overpass. Geoprocessing chains were implemented in Python using open-source libraries and frameworks. This study investigated efficiencies across three dimensions: (1) software libraries, (2) tightly-coupled/serial versus loosely-coupled/distributed geoprocessing chain implementations, and (3) standardised geoprocessing web service (Web Processing Service) implementations versus bespoke software solutions. Results show that bespoke software, using specific geoprocessing libraries, implemented on a loosely-coupled messaging architecture significantly outperforms other combinations.
Lauren Hankel, Graeme McFerren, Serena Coetzee
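The contrast between the tightly-coupled/serial and loosely-coupled/distributed chain styles compared above can be sketched as follows. The stage names and the queue-based wiring are assumptions for illustration only; the AFIS pipeline itself is not reproduced here.

```python
# Two styles of geoprocessing chain for a stream of fire-detection records.
import queue
import threading

def detect(records):
    # Stage 1: keep only confident detections (threshold is illustrative).
    return [r for r in records if r["confidence"] >= 0.8]

def enrich(records):
    # Stage 2: attach extra context (a placeholder placename here).
    return [{**r, "place": "unknown"} for r in records]

def serial_chain(records):
    """Tightly-coupled style: each stage calls the next directly."""
    return enrich(detect(records))

def queued_chain(records):
    """Loosely-coupled style: stages exchange messages over queues,
    so they could run in separate processes or hosts."""
    q_in, q_out = queue.Queue(), queue.Queue()

    def worker():
        q_out.put(enrich(detect(q_in.get())))

    t = threading.Thread(target=worker)
    t.start()
    q_in.put(records)
    result = q_out.get()
    t.join()
    return result
```

Both chains yield identical output; the loosely-coupled version trades a small messaging overhead for the ability to scale stages independently, which is the dimension the chapter's benchmarks measure.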

Chapter 5. Service-Oriented Map Production Environments: The Implementation of InstaMaps

To overcome the limitations of paper maps and promote the use of its own digital cartography, the ICGC (Institut Cartogràfic i Geològic de Catalunya, the national mapping agency of Catalonia) has created, developed and, since 2013, maintained InstaMaps, an open SaaS (Software as a Service) platform that allows anyone to combine all sorts of geodata and publish the result online as a new (thematic) map.
Rafael Roset, Marc Torres, Wladimir Szczerban, Jessica Sena, Victor Pascual, Montse Ortega, Isaac Besora, Sergio Anguita

Chapter 6. Depiction of Multivariate Data Through Flow Maps

Flow maps are graphical representations that depict the movement of a geospatial phenomenon, e.g. migration and trade flows, from one location to another. These maps depict univariate spatial origin-destination datasets, with flows represented as lines and quantified by their width. One main feature of these maps is the aggregation of flows that share the same origin. Thus, visual clutter is reduced and readability improved. This chapter describes a novel technique that extends flow maps to visualize multivariate geographical data. Instead of a univariate color scheme, we interpolate strategic color schemes to communicate multivariate quantitative information. Additionally, our approach centers on augmenting flows with pie charts. To evaluate the relevance and impact of our approach, three case studies are presented.
Alberto Debiasi, Bruno Simões, Raffaele De Amicis
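The core idea of encoding several categories on one flow can be sketched in a few lines: mix category colors in proportion to their share of the flow. The RGB weighted average below is an illustrative assumption; the chapter's actual technique interpolates strategic color schemes and augments flows with pie charts.

```python
# Mix category colors for a multivariate flow symbol.
def mix_colors(parts):
    """parts: list of ((r, g, b), weight) pairs, one per category.
    Returns the weight-proportional average RGB color."""
    total = sum(w for _, w in parts)
    return tuple(
        round(sum(color[i] * w for color, w in parts) / total)
        for i in range(3)
    )
```

For example, a flow split evenly between a red and a blue category would be drawn in purple; as one category dominates, the line color shifts toward that category's hue.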

Chapter 7. Humanitarian Demining and the Cloud: Demining in Afghanistan and the Western Sahara

Communities in war-torn countries continue to face many life-threatening situations long after the end of a war. These situations include the contamination of the environment by landmines and explosive remnants of war (ERW). One of the main objectives of mine action is to address problems faced by communities owing to landmine contamination. Since the removal of all landmines worldwide is improbable, the humanitarian demining sector focusses on removing landmines from areas where communities are most affected. Due to the decrease in donor funding, there is continued pressure for more effective and efficient mine action through improved and appropriate data collection, analysis and the use of the latest technologies. Proper data management, sharing of data in the collaborative cloud and improved decision support systems to prioritize areas for demining will result in more effective mine action. This chapter will discuss humanitarian demining as one of the components of mine action and will emphasize the importance of mapping an area for demining purposes. The importance of data management for decision support systems to prioritize areas for demining is covered with specific reference to data collection, manipulation, dissemination and data quality. The important role that the collaborative cloud plays in data dissemination and sharing is expanded upon. Use cases of the collaborative cloud and humanitarian mapping are noted and the role of data security is described. The latest decision support systems for humanitarian mapping are briefly discussed. The main shortcoming of these decision support systems is the lack of a spatial analysis component. The development of a decision support tool based on a Geographical Information System is illustrated by referring to case studies in Afghanistan and Western Sahara. The successful use of the GIS-based decision support system has consequently led to the development of a spatial multi-criteria analysis tool.
The spatial multi-criteria analysis tool emphasizes the importance of sharing data in the collaborative cloud and the use of quality data. This tool contributed to humanitarian demining and mapping by assisting in better and faster decision making processes at a reduced cost.
Peter Schmitz, Johannes Heymans, Arie Claassens, Sanet Carow, Claire Knight

Chapter 8. Modern Requirement Framework for Geospatial Information in Security and Government Services

Appropriate requirement management for geospatial information is a delicate and sensitive task, because this phase strongly influences the success of the geospatial service delivery. During requirement design, the special needs of the service target group need to be understood and translated into concrete map layouts, geospatial data content or topology, and symbols that provide the necessary information for the end user. The distinction between the strategic (what?), operational (how?) and lower tactical levels is a useful differentiation during requirement design, especially for hierarchically structured organizations. In addition, at the management level an optimal balance between time, quality and cost needs to be defined for the project. The main constraints for modern mapping activities are short time-to-deliver requirements, completeness, accuracy and the possibility to easily share and comprehend the map. Rapid mapping supports fast map production on the one hand; geospatial intelligence collects relevant information for security actions on the other. This contribution focuses on the requirements of map production for security and government services, such as combined civil-military humanitarian relief missions.
Friedrich Teichmann

Importance and Impact of the New Map Production Paradigm


Chapter 9. Developing a Statistical Geospatial Framework for the European Statistical System

National Statistical Institutes (NSIs) have recognised the need for a better consideration of the geospatial dimension in the collection, production and dissemination of statistics for more than a decade. There is now general consensus in the European statistical community on the best methodology for linking statistics to a location through the exact geocoding of statistical units—persons, buildings, enterprises and households—to a point-based geocoding infrastructure for statistics based on addresses. In these terms the map plays an important role for visualisation and quality control. However, the data used and the actual processes to integrate geospatial information management and geocoding of statistics into statistical production can vary substantially between countries; as a result, geospatial statistics products have not yet entered the mainstream of official statistics and often lack comparability between countries. The recent adoption of the Statistical Geospatial Framework (SGF) by the Committee of Experts on United Nations Global Geospatial Information Management (UN-GGIM) offers for the first time a consistent framework to harmonise statistical-geospatial data integration and geospatial data management for statistics internationally. If fully developed and adopted by all NSIs, the Statistical Geospatial Framework should ensure high-quality geospatial statistics. The next round of population censuses in 2021 will generate a vast amount of statistical information on our societies with unprecedented spatial resolution. At the same time, recent European and global sustainable development programs will demand spatially disaggregated indicators. Specifically, the indicator framework for the 2030 Agenda on sustainable development offers a unique opportunity to demonstrate the power of statistical-geospatial data integration across a wide range of statistical domains.
To better structure these efforts and fit them into a framework that will result in consistent and comparable geospatial statistics, the European Statistical System (ESS) will partner with National Mapping and Cadastral Agencies for the development of the ESS-SGF. The actual development is carried out via the GEOSTAT 3 project funded by Eurostat. Special attention will be paid to full interoperability with statistical and geospatial data infrastructures and process models such as SDMX, INSPIRE and the Generic Statistical Business Process Model (GSBPM).
Marie Haldorson, Jerker Moström

Chapter 10. Vizzuality: Designing Purposeful Maps to Achieve SDGs

The amount of data readily available for people to access has been increasing rapidly over the last decades, due to the cumulative effects of the open data movement, the huge growth of cheap cloud storage, and broader access to the internet through smartphones. This poses a double-edged sword: we know more about our world than ever before, but making sense of all this data and communicating it to a general audience can be a big challenge. Vizzuality have been working with research organisations, universities, Non-Governmental Organizations (NGOs) and governments across the world to develop interactive, understandable and usable data tools. Through a combination of maps, tables, charts and other data visualizations, they’ve helped these organisations communicate cutting-edge research on sustainable development to policy-makers, analysts, the media and the general public. From deforestation to government transparency and emissions from agriculture, this data can help these people make decisions to create a more sustainable world. This chapter will outline the techniques used by Vizzuality to understand what data needs to be communicated, how, and with whom, for a certain change to take place. The interdisciplinary approach, encompassing psychology, design, data science and software development, will be revealed through three case studies of Vizzuality’s products.
Sergio Estella, Jamie Gibson

Chapter 11. Geospatial Data Mining and Analytics for Real-Estate Applications

Market information on housing is relevant for good decision making by households as well as for real estate economics, and is also of systemic interest. Official statistics often provide only annual indices at the national level, based on transactions of the previous year. The stakeholders of real estate markets, however, request analyses of higher spatial resolution at the local level based on recent transactions. This paper focuses on the methodology of automated data acquisition, analysis and visualization of data from the housing market. Data retrieved from observing real estate markets (to rent/to buy) are used for statistical modelling of value-descriptive parameters and for estimating and forecasting real estate market fundamentals, which can also reveal market risks. This paper elaborates methods of automated data acquisition based on web mining and reviews methods of econometric spatial modelling with impact from proximity in order to derive efficiency parameters within different categories. The analysis and the resulting visualization of spatial, temporal and typological effects at different granularities provide relevant information for decision making.
Gerhard Muggenhuber

Chapter 12. SDI Evolution and Map Production

Cartography and spatial data infrastructures (SDIs) are mutually dependent on each other: SDIs provide geographic information for maps; and maps help to understand geographic information provided by an SDI. SDIs emerged in the 1990s when paper maps were replaced by map production from digital geographic information. They evolved from government-funded initiatives within the geographic information science (GISc) community to SDIs with a variety of data producers and an audience much beyond the relatively small GISc community. SDIs continue to evolve in response to new technologies and developments. This chapter traces SDI evolution over the last three decades and considers how SDI technical developments have impacted map production and how they are likely to impact map production in the future.
Serena Coetzee

Chapter 13. Handling Continuous Streams for Meteorological Mapping

Providing weather and climate data services requires complex IT systems and infrastructures as well as software systems running in 24/7 operation. Meteorological data streams are used both in the sciences and for operational tasks, e.g. in national meteorological weather services. Applications for weather services and products have a strong cross-domain impact; interoperability and standards-conformant data use are therefore essential for everyday tasks. Three use cases will be shown: (1) an approach to proper data management applying dynamic data citation, (2) services for numerical weather prediction, and (3) trajectories.
Chris Schubert, Harald Bamberger

Chapter 14. Datacubes: Towards Space/Time Analysis-Ready Data

Datacubes form an emerging paradigm in the quest for providing EO data ready for spatial and temporal analysis; this concept, which generalizes the concept of seamless maps from 2-D to n-D, is based on preprocessing incoming data so as to integrate all data from one sensor into one logical array, say 3-D x/y/t for image timeseries or 4-D x/y/z/t for weather forecasts. This enables spatial analysis (both horizontally and vertically) and multi-temporal analysis simultaneously. Adequate service interfaces enable “shipping code to the data” to avoid excessive data transport. In standardization, datacubes belong to the category of coverages as established by ISO and OGC. In this contribution we present the OGC datacube data and service model—the Coverage Implementation Schema (CIS) and the Web Coverage Service (WCS) with its datacube analytics language, the Web Coverage Processing Service (WCPS)—and put them in context with further related standards. Finally, we discuss architectural details of datacube services by way of operational tool examples.
Peter Baumann, Dimitar Misev, Vlad Merticariu, Bang Pham Huu
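A client-side sketch of submitting a WCPS query through the WCS processing extension might look as follows. The endpoint and coverage name are hypothetical; only the query language itself follows the OGC WCPS specification.

```python
# Build a WCS "ProcessCoverages" KVP GET request carrying a WCPS query.
from urllib.parse import urlencode

# Illustrative WCPS query: slice a (hypothetical) 4-D coverage at one
# time step and encode the result as GeoTIFF.
WCPS_QUERY = (
    'for $c in (Temperature4D) '
    'return encode($c[ansi("2021-06-01T00:00:00Z")], "image/tiff")'
)

def build_wcps_request(endpoint: str, query: str) -> str:
    """Encode a ProcessCoverages request as a WCS 2.0 KVP GET URL."""
    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": query,
    }
    return endpoint + "?" + urlencode(params)
```

Because the query executes server-side, only the encoded slice travels over the network—the “shipping code to the data” pattern described above.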

Requirements of the New Map Production Paradigm


Chapter 15. How Standards Help the Geospatial Industry Keep Pace with Advancing Technology

This chapter describes how, beginning in the 1990s, the information technology (IT) industry moved into a new paradigm based on widespread networking of computers. This new paradigm forced map production and geoinformation management system providers and users to adapt accordingly. Widespread networking of computers brings the opportunity for widespread discovery, sharing and integration of data. This cannot happen, however, without widespread agreement on standard protocols, data models, encodings and best practices. The inherent complexity of geospatial information (Longley et al. 2015) imposes particular interoperability problems that can only be resolved through a geo-focused standards development process. The Open Geospatial Consortium (OGC) emerged to lead this process, coordinating with many other organizations. It appears that the geospatial standards development process will be ongoing as long as the underlying information technology platform continues to evolve and new spatial technology user domains emerge and evolve.
Lance McKee, Joan Masó

Chapter 16. Standards—Making Geographic Information Discoverable, Accessible and Usable for Modern Cartography

Cartography relies on data. Today, data is generated in unprecedented volumes, velocity, and variety. As a result, cartographers need ever more assistance in finding appropriate data for their maps and in harmonizing heterogeneous data before functional maps can be produced. Spatial data infrastructures (SDIs) provide the fundamental facilities, services and systems for finding data. Implementing standards for geographic information and services plays a significant role in facilitating harmonization and interoperability in an SDI. This chapter reviews collaboration between standards development organizations in the field of geographic information, and describes resources available for a model-driven approach in the implementation of geographic information standards. Subsequently, good practice examples from Canada, Denmark, Japan, and Europe illustrate how standards implementation facilitates harmonization and interoperability, and how SDIs make geographic information discoverable, accessible, and usable for modern cartography.
Serena Coetzee, Reese Plews, Jean Brodeur, Jan Hjelmager, Andrew Jones, Knut Jetlund, Roland Grillmayer, Christina Wasström

Chapter 17. From Maps to Apps—Entering Another Technological Dimension Within the Geological Survey of Austria

The creation of geological maps and the provision of geological information has a long history at the Geological Survey of Austria (GBA). Geological maps enable the visualization of geoscientific topics of interest to the public and the authorities. As early as the middle of the nineteenth century the Geological Survey of Austria started producing geological maps. From these first individual sketch sheets and the first data sets to digital information provided via web services, information technology now leads us into the world of controlled vocabularies, such as the Thesaurus of the GBA. In the following, the path from individual geological manuscripts via the digitization process to the data administration for apps and semantic web capabilities at the GBA will be described.
Christine Hörfarter, Johannes Reischer, Martin Schiegl

Chapter 18. Statistical Disclosure Control in Geospatial Data: The 2021 EU Census Example

This chapter outlines challenges and modern approaches in statistical disclosure control of official high-resolution population data on the example of the EU census rounds 2011 and 2021, where a particular focus is on the European 1 km grid outputs derived from these censuses. After a general introduction to the topic and experiences from 2011, the recommended protection methods for geospatial data in the planned 2021 census 1 km grids are discussed in detail.
Fabian Bach
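One classic building block of such protection methods is primary suppression of small counts, sketched below. The threshold and the grid-cell naming are illustrative assumptions; the rules discussed in the chapter combine several methods (e.g. targeted perturbation) rather than suppression alone.

```python
# Primary suppression for a 1 km population grid: small nonzero counts
# could single out individuals, so they are withheld before publication.
def suppress_small_counts(grid, threshold=3, marker=None):
    """Replace any cell count with 0 < count < threshold by `marker`.
    Zero cells are usually considered safe to publish as-is."""
    return {
        cell: (marker if 0 < count < threshold else count)
        for cell, count in grid.items()
    }
```

In practice suppression alone is insufficient (a suppressed cell can sometimes be reconstructed from published totals), which is why secondary suppression or noise-based methods accompany it.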

Chapter 19. Supply Chains and Decentralized Map Productions

Map production is a process in which data is collected and production materials are sourced from suppliers, then transformed into maps and made available to a customer base by the mapping entity. This production process is a basic supply chain: suppliers provide the data as well as the production material; the mapping entity is the firm that produces the maps; and the customers of the mapping entity obtain the maps. Maps are available as digital maps that can be viewed on a computer, tablet or smartphone, or as paper maps. The production process can be entirely in-house, or it can be decentralized, meaning various entities may collect the data, another entity can collate the data and make it publication-ready, and other entities can distribute the maps digitally or print them for distribution to customers. This chapter looks at the use of supply chain management to manage and improve decentralized map production and to exceed customer expectations, using the Supply-Chain Operations Reference (SCOR) model as the supply chain modeling tool.
Peter Schmitz

Chapter 20. Characterizing Potential User Groups for Versioned Geodata

We explore the characteristics of different user groups for legacy geodata from the perspective of a long-term archive. For the sake of this study, legacy geodata has been defined as all digital information useful for map creation, including aerial photography, digital elevation models, LIDAR data, vector databases etc., of which there exists at least one more recent version with the same characteristics. In the context of the ISO standard for open archival information systems (OAIS), potential user groups are called designated communities. The archive is supposed to adapt its service to their profiles and needs, which in the electronic environment includes taking into account their level of knowledge of technical aspects. A futures technique, more precisely a Delphi study, has been used to predict the potential user groups and their need for geodata versions. In two rounds, two international Delphi groups have been questioned about user professions, frequency of access, amount of data needed, knowledge of GIS, age of the data they are interested in, preferred data set, scales, snapshot intervals and file formats. The answers allowed us to identify the following user types: geophysicists, commercial users, lawyers, policy makers, emergency response planning teams, architects and geo-related engineers, social scientists, the general public, archaeologists, historians, culture and arts professionals, conservation agents of the built and the natural environment, geodata creators, and undergraduate teachers and students. We classified the user types by their characteristics into six clusters. The application of the user profiles showed that the method did not deliver sufficiently detailed answers for complying with all OAIS requirements, but that it was effective for gathering user characteristics which guide archives in strategic decisions about the designated communities they might serve.
Anita E. Locher