About this Book

This book constitutes the refereed proceedings of the 11th IFIP WG 5.11 International Symposium on Environmental Software Systems, ISESS 2015, held in Melbourne, Australia, in March 2015. The 62 revised full papers presented were carefully reviewed and selected from 104 submissions. The papers are organized in the following topical sections: information systems, information modeling and semantics; decision support tools and systems; modelling and simulation systems; architectures, infrastructures, platforms and services; requirements, software engineering and software tools; analytics and visualization; and high-performance computing and big data.

Table of Contents

Frontmatter

Keynotes and Context Articles

A Provenance Maturity Model

The history of a piece of information is known as “provenance”. From extensive interactions with hydro- and geo-scientists in Australian science agencies, we found both widespread demand for provenance and widespread confusion about how to manage it and how to develop requirements for managing it.

We take inspiration from the well-known software development Capability Maturity Model to design a Maturity Model for provenance management that we call the PMM. The PMM can be used to assess the state of existing practices within an organisation or project, to benchmark practices and existing tools, to develop requirements for new provenance projects, and to track improvements in provenance management across an organisational unit.

We present the PMM and evaluate it through application in a workshop of scientists across three data-intensive science projects. We find that scientists recognise the value of a structured approach to requirements elicitation that ensures that aspects are not overlooked.

Kerry Taylor, Robert Woodcock, Susan Cuddy, Peter Thew, David Lemon

Challenges in Modelling of Environmental Semantics

Modelling environmental semantics is a prerequisite for model and data interoperability and reuse, both essential for integrated modelling. This paper previews a landscape where integrated modelling activities are performed in a virtual environmental information space, and identifies challenges imposed by the nature of integrated modelling tasks and by new technology drivers such as sensor networks, big data and high-performance computing. A set of requirements for a universal framework for sharing environmental data and models is presented. The approach is demonstrated in the case study of a semantic modelling system for wildlife monitoring, management and conservation.

Ioannis N. Athanasiadis

Topics in Environmental Software Systems

Environmental software systems (ESS) are software systems supporting activities of environmental protection, environmental management, environmental policy and environmental sciences. ESS often overlap with adjacent application fields like security, agriculture, health or climate change. The ISESS conference series is one of several overlapping events which are devoted to environmental modelling and software systems. ISESS is probably the one international event which has the strongest focus on the software angle.

This paper gives a historical perspective on 20 years of ISESS conferences. Starting with a review of the development of the field named “environmental software systems”, it puts the ISESS conference into the greater perspective of similar activities. Using the original materials, the paper identifies typical themes subsumed under ESS and highlights the topics typical of each period.

The paper is the only existing complete collection of the activities of the IFIP working group 5.11 “Computers and Environment”. Material from all events, ISESS conferences as well as co-organised workshops, has been collected in one central place and will also be kept up to date in the future.

Ralf Denzer

The Framework for Environmental Software Systems of the European Environment Agency

The European Environment Agency (EEA) is the authoritative European environment node and hub, and a key initiator within networks of knowledge co-creation, sharing and use in the European Union (EU). It ensures the quality, availability and accessibility of the environmental data and information needed to support its strategic areas: informing policy implementation and assessing systemic challenges. It actively communicates data, information and knowledge to policymakers, the public, research communities and others (non-governmental organizations, businesses), as well as to regional and international processes, including those of the United Nations and its specialised agencies, and it promotes information governance as a driver of public empowerment and behavioural change. In the past few years the EEA’s environmental information systems, together with environmental modelling supported by environmental software systems, have underpinned decision-making processes within the EU. These systems have undergone rapid development and have grown to support the knowledge base of the European Commission and the EU Member States. Specifically, new infrastructure was put in place to support supply services (collection of data); networking (knowledge management); workflows (planning, automation, quality management); and the development of final products and public services (reports, web sites, public data and information services). The EEA strengthens the infrastructure for environmental data and information sharing both at the EEA and in the European Environment Information and Observation Network, together with cooperating countries, taking into account the Shared Environmental Information System (SEIS) and the Infrastructure for Spatial Information in Europe (INSPIRE) developments. The paper presents the framework for the development of EEA environmental software systems and information services adopted in the Multiannual Working Plan of the EEA for 2014–2018, and its implications for ICT.

Jiří Hřebíček, Stefan Jensen, Chris Steenmans

Crowdsourcing in Crisis and Disaster Management – Challenges and Considerations

With the rise of social media platforms, crowdsourcing became a powerful tool for mobilizing the public. Events such as the earthquake in Haiti or the downfall of governments in Libya and Egypt indicate its potential in crisis situations. In the scope of this paper, we discuss the relevance of crowdsourcing in the area of crisis and disaster management (CDM). Starting with a general overview of the topic, we distinguish between different types of crowds and crowdsourcing and define what is meant by crowdtasking in the area of CDM. After considering technological, societal and ethical challenges for using crowdsourcing in crisis management, applications of crowdsourcing tools in ongoing projects are described and future developments outlined.

Gerald Schimak, Denis Havlik, Jasmin Pielorz

Evolution of Environmental Information Models

Access to environmental data, based on standardized data models and services, is becoming ever more prevalent, providing stakeholders with access to a wide range of standardized environmental data from diverse sources. However, exactly this success brings new problems, with thematic extensions of these standardized models being created by disparate thematic communities based on their specific requirements. In contrast to the traditional standards development process, which includes mechanisms for maintaining the alignment of concepts across different sections of a standard, once these standards are extended by a larger and less strictly structured community, the alignment process becomes increasingly difficult. This position paper sketches this problem, as illustrated by the example of the European INSPIRE process, and serves as a basis for the conference workshop discussion that aims to capture both further facets of the problem and possible solutions.

Katharina Schleidt

Information Systems, Information Modelling and Semantics

An Interactive Website for the River Eurajoki

The project “Our Common River Eurajoki” aims for a long-term sustainable and collaborative platform for improving and managing the river water quality. We describe a website that supports the project by linking social media to environmental monitoring and web mapping. The website development was based on a five-tiered software architecture and on several systems working together. Coding was minimized by re-using existing software components. We discuss the workflows supported by the website and the role of information system standards in environmental monitoring.

Ari Jolma, Anne-Mari Ventelä, Marjo Tarvainen, Teija Kirkkala

An Information Model for a Water Information Platform

Sharing of open government data is hindered, among other reasons, by the incompatibility of data models across different data collections. Only a few areas in the environmental domain have progressed towards commonly used data models. The purpose of this paper is to share with the community a data model which is used in a spatial information platform being built for the purpose of sharing open government data in the domain of water sciences. The objective when building the information model was not to be restricted to a single (meta)data standard. The information model therefore uses several standards and extension mechanisms: the ISO 19000 series, the Comprehensive Knowledge Archive Network (CKAN), dynamic tag extension and dynamic content extension. The CKAN domain model can also be mapped to semantic-web-compatible standards such as Dublin Core and the Data Catalogue Vocabulary of the World Wide Web Consortium.

Pascal Dihé, Ralf Denzer, Sascha Schlobinski
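
As an illustration of the kind of mapping the abstract mentions, the sketch below translates a handful of CKAN-style dataset fields into Dublin Core/DCAT properties. It is a minimal sketch only: the field selection and target property names are our assumptions, not the platform’s actual information model.

```python
# Illustrative sketch: map a CKAN-style dataset record to a few
# DCAT/Dublin Core properties. Field and property choices are assumptions.
CKAN_TO_DCAT = {
    "title": "dct:title",
    "notes": "dct:description",
    "metadata_created": "dct:issued",
    "tags": "dcat:keyword",
    "url": "dcat:landingPage",
}

def ckan_to_dcat(record):
    """Translate a CKAN dataset dict into DCAT key/value pairs."""
    return {dcat_key: record[ckan_key]
            for ckan_key, dcat_key in CKAN_TO_DCAT.items()
            if ckan_key in record}

print(ckan_to_dcat({"title": "River gauge levels",
                    "tags": ["water", "hydrology"]}))
```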

Towards Linked Data Conventions for Delivery of Environmental Data Using netCDF

netCDF is a well-known and widely used format for exchanging array-oriented scientific data such as grids and time-series. We describe a new convention, called netCDF-LD, for encoding netCDF based on Linked Data principles. netCDF-LD allows metadata elements, currently given as string values in netCDF files, to be expressed as Linked Data objects. This allows precise semantics to be attached to elements and expands the type options beyond lists of controlled terms. Using Uniform Resource Identifiers (URIs) for elements allows them to refer to other Linked Data resources for their types and descriptions. This enables improved data discovery through a generic mechanism for element type identification, and lets element types be extended to new Linked Data resources as they become available. By following patterns already established for extending existing formats, netCDF-LD applications can take advantage of existing software for processing Linked Data, supporting more effective data discovery and integration across systems.

Jonathan Yu, Nicholas J. Car, Adam Leadbetter, Bruce A. Simons, Simon J. D. Cox
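
A minimal sketch of the idea, assuming the Python netCDF4 library: a metadata element that is a bare string in a conventional file is paired with a resolvable URI. The attribute naming pattern and the URI are invented for illustration and are not the published netCDF-LD convention.

```python
# Sketch only: pair a conventional string-valued metadata element with a
# dereferenceable URI. Attribute names and URIs here are invented, not the
# actual netCDF-LD convention.
from netCDF4 import Dataset

ds = Dataset("example_ld.nc", "w")
ds.createDimension("time", None)
temp = ds.createVariable("temperature", "f4", ("time",))

# Conventional encoding: an opaque string from a controlled list.
temp.setncattr("standard_name", "air_temperature")
# Linked Data rendering: the same element as a resolvable identifier,
# so clients can fetch the term's definition and relations.
temp.setncattr("standard_name__uri",
               "http://example.org/vocab/standard_name/air_temperature")
ds.close()
```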

Information Technology and Solid Residue Management: A Case of Study Using Freeware and Social Networks

Separation, collection, processing and disposal of waste considered recyclable are currently among the greatest challenges faced by human beings in their quest for sustainability. With consumption at ever-higher levels, the reuse of recyclable waste, which would otherwise likely end up as garbage, becomes an obligation of society as a whole. Addressing the selective collection of recyclable materials in an appropriate manner can thus contribute to the solution of several problems associated with it, chiefly from the environmental, social and economic points of view. The iCARE project (Instrumentation for the Assisted Collection of Recyclable Waste) is based on an experiment that aims to provide computational tools that can contribute significantly to the reuse of solid recyclable waste. Specifically, the iCARE software aims to support the integration of donors and collectors of recyclable materials, establishing a communication channel where ordinary people or companies may announce the availability of recyclables to collectors. Collectors, in turn, can check through the same software the availability of certain wastes, contact the consumers for possible collection and subsequently forward the waste to processing companies. The software also offers a “routing module”, which allows the collector to establish an optimized collection route according to certain criteria. All software developed in iCARE’s framework must provide a user-friendly interface, making its use very simple. iCARE is a free-to-use project and also a social tool, which seeks to contribute to the solution of the complex problem of recyclable waste disposal.

José Tarcísio Franco de Camargo, Estéfano Vizconde Veraszto, Adriano Aparecido Lopes, Tainá Ângela Vedovello Bimbati

Joining the Dots: Using Linked Data to Navigate between Features and Observational Data

Information about localized phenomena may be represented in multiple ways. GIS systems may be used to record the spatial extent of the phenomena. Observations about the state of one or more properties of the phenomena are available from real-time sensors, models, or from archives. The relationships between these data sources, or specific features in different data products, cannot easily be specified. Additionally, features change over time, their representations use different spatial scales and different aspects of them are of concern to different stakeholders. This greatly increases the number of potential relationships between features. Thus, for a given feature we can expect that heterogeneous information systems will exist, holding different types of data related to that feature. We propose the use of Linked Data to describe the relationships between them. We demonstrate this in practice using the Australian Hydrologic Geospatial Fabric (Geofabric) feature dataset and observational data of varying forms, including time-series and discrete measurements. We describe how different resources, and different aspects and versions of them, can be discovered and accessed. A web client is described that can navigate between related resources, including using the Geofabric’s feature relationships, to navigate from one observational dataset to another related by hydrological connectivity.

Robert A. Atkinson, Peter Taylor, Geoffrey Squire, Nicholas J. Car, Darren Smith, Mark Menzel

An SMS and Email Weather Warning System for Sheep Producers

Sheep are vulnerable to hypothermia shortly after birth and shearing. Since the 1970s, sheep weather alerts have been reported at a regional scale by the media up to 24 hours prior to a chill event. The SMS and email weather warning system was designed as an enhanced service providing sheep producers with advance warnings of forthcoming chill events, based on local weather forecasts, with personalized chill warnings delivered by SMS and email. A trial was conducted with 30 sheep producers, who selected one or more local weather stations and a low, medium or high sensitivity threshold to control the frequency at which messages were sent. Sensitivity thresholds were calculated for each weather station from historical data. Numerical forecast data were sourced from the Bureau of Meteorology, and an email and SMS were sent each morning whenever the forecast chill exceeded the warning threshold within the 7-day forecast period. Participants were interviewed by telephone after a 2-month trial. The alerts were found to be clear and reasonably accurate, but produced an unexpectedly high number of false warnings at some sites. The SMS format was well received, and farmers were generally happy to continue the trial. False warnings were attributed to over-prediction of wind speeds at some sites relative to on-ground weather stations, most of which were in northern Victoria.

Anna Weeks, Malcolm McCaskill, Matthew Cox, Subhash Sharma
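
The alerting logic can be pictured as below. The chill-index formula, threshold and message format are placeholders, not the values used in the trial.

```python
# Minimal sketch of the alerting logic. The chill-index formula and the
# threshold are invented placeholders, not the trial's actual parameters.
def chill_index(temp_c, wind_ms, rain_mm):
    """Toy chill index: colder, windier, wetter -> higher value."""
    return (40.0 - temp_c) * (1.0 + 0.3 * wind_ms) + 10.0 * min(rain_mm, 5.0)

def warnings_for_forecast(forecast, threshold):
    """Return warning messages for any forecast day over the threshold."""
    alerts = []
    for day in forecast:
        ci = chill_index(day["temp_c"], day["wind_ms"], day["rain_mm"])
        if ci > threshold:
            alerts.append(f"Sheep chill warning for {day['date']}: index {ci:.0f}")
    return alerts

forecast = [{"date": "2015-03-02", "temp_c": 4.0, "wind_ms": 9.0, "rain_mm": 6.0}]
for msg in warnings_for_forecast(forecast, threshold=60.0):
    print(msg)  # in the trial this would be dispatched via SMS/email gateways
```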

The Emergency Response Intelligence Capability Tool

The Emergency Response Intelligence Capability (ERIC) tool, http://eric.csiro.au, automatically gathers data about emergency events from authoritative web sources, harmonises the information content and presents it on an interactive map. All data is recorded in a database which allows the changing status of emergency events to be identified and provides an archive for historical review.

ERIC was developed for the Australian Government Department of Human Services Emergency Management team, which is responsible for intelligence gathering and situation reporting during emergency events. Event information is combined with demographic data to profile the affected community. Identifying relevant community attributes, such as languages spoken or socioeconomic information, allows the department to tailor its response appropriately to better support the impacted community.

An overview of ERIC is presented, including its use by the department and the difficulties overcome in establishing and maintaining a nationally consistent harmonised model of emergency event information. Preliminary results of republishing the emergency event information using the Australian Profile of the Common Alerting Protocol, an XML standard to facilitate the construction and exchange of emergency alert and warning messages, are also presented.

Robert Power, Bella Robinson, Catherine Wise, David Ratcliffe, Geoffrey Squire, Michael Compton
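
For readers unfamiliar with the Common Alerting Protocol, the sketch below emits a minimal CAP 1.2 alert in Python. All values are invented, and the Australian profile (CAP-AU) adds constraints not shown here.

```python
# Hedged sketch of a minimal CAP 1.2 alert, of the kind ERIC's republishing
# experiment produces. Values are invented; CAP-AU adds further constraints.
import xml.etree.ElementTree as ET

NS = "urn:oasis:names:tc:emergency:cap:1.2"
ET.register_namespace("", NS)

def cap_alert(identifier, sender, sent, event, area_desc):
    alert = ET.Element(f"{{{NS}}}alert")
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Actual"),
                      ("msgType", "Alert"), ("scope", "Public")]:
        ET.SubElement(alert, f"{{{NS}}}{tag}").text = text
    info = ET.SubElement(alert, f"{{{NS}}}info")
    for tag, text in [("category", "Fire"), ("event", event),
                      ("urgency", "Expected"), ("severity", "Severe"),
                      ("certainty", "Likely")]:
        ET.SubElement(info, f"{{{NS}}}{tag}").text = text
    area = ET.SubElement(info, f"{{{NS}}}area")
    ET.SubElement(area, f"{{{NS}}}areaDesc").text = area_desc
    return ET.tostring(alert, encoding="unicode")

print(cap_alert("EX-0001", "eric-demo@example.org",
                "2015-03-01T09:00:00+10:00", "Bushfire", "Example Shire"))
```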

Civic Issues Reporting and Involvement of Volunteers as a Phenomenon in the Czech Republic

Today’s smartphones can unlock the full potential of crowdsourcing and take eParticipation to a new level, allowing users to contribute transparently to complex and novel problem solving. Engaging citizens remains challenging, but the proliferation of smartphones with geolocation has made it easier than before. In this paper we introduce the ZmapujTo project, a reporting platform in the form of a mobile application and responsive web page intended for citizens to report civic issues. We also describe the technology stack and system architecture used. In addition, we introduce “Ukliďme Česko” (Clean up the Czech Republic), an inaugural nationwide clean-up event taking place in the Czech Republic. We explain how ZmapujTo is applied in this event and the ICT tools that are offered to volunteers and organizers for communication and management of the event.

Miroslav Kubásek

Mobile Field Data Collection for Post Bushfire Analysis and African Farmers

In recent years CSIRO has been trialling field data collection using mobile devices such as phones and tablets. Two recently developed CSIRO tools are the CSIRO Surveyor (Post Bushfire House Surveyor) and DroidFarmer. Challenges tackled include mapping field documents to mobile data through QR (Quick Response) codes, rapid input of survey data, accurate capture of GPS locations and offline operation. Throughout this paper we detail the design choices made for these systems. We report on how well field data collection performed and discuss our planned future developments in this space.

Bradley Lane, Nicholas J. Car, Justin Leonard, Felix Lipkin, Anders Siggins

Provenance in Systems for Situation Awareness in Environmental Monitoring

As environmental monitoring systems increasingly automate the collection and processing of environmental sensor network data, the technical components of such systems can automatically obtain and maintain higher levels of situation awareness—awareness of the monitored part of reality. In order to increase confidence in the correctness of situation awareness maintained by such systems it is important to explicitly model provenance. We present an alignment of the PROV ontology with ontologies used in a software framework for situation awareness in environmental monitoring, called Wavellite. The extended vocabulary enables the explicit representation of provenance in Wavellite applications. We demonstrate the implementation for a concrete scenario.

Markus Stocker, Mauno Rönkkö, Mikko Kolehmainen
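
An alignment of this kind can be stated in a few RDF triples. The sketch below uses rdflib and the real PROV-O namespace, but the Wavellite namespace and class names are placeholders rather than the actual ontology terms.

```python
# Sketch of aligning domain classes with PROV-O using rdflib. The Wavellite
# namespace and class names below are placeholders, not the real ontology.
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

PROV = Namespace("http://www.w3.org/ns/prov#")
WL = Namespace("http://example.org/wavellite#")  # placeholder namespace

g = Graph()
g.bind("prov", PROV)
g.bind("wl", WL)

# State that a dataset observation is a kind of PROV entity, and that a
# situation-derivation step is a kind of PROV activity.
g.add((WL.DatasetObservation, RDFS.subClassOf, PROV.Entity))
g.add((WL.SituationDerivation, RDFS.subClassOf, PROV.Activity))

print(g.serialize(format="turtle"))
```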

Decision Support Tools and Systems

Decision Making and Strategic Planning for Disaster Preparedness with a Multi-Criteria-Analysis Decision Support System

In the context of the CRISMA FP7 project we have developed a seamless decision support concept to connect simulated crisis scenarios and aggregated performance indicators of impact scenarios with state-of-the-art Multi-Criteria Decision Analysis (MCDA) methods. To prove the practicality of the approach we have developed a decision support tool realising the important aspects of the method. The tool is a highly interactive and user-friendly decision support system (DSS) that effectively helps the decision maker and strategic planner to perform multi-criteria ranking of scenarios. The tool is based on state-of-the-art web technologies.

Sascha Schlobinski, Giulio Zuccaro, Martin Scholl, Daniel Meiers, Ralf Denzer, Sergio Guarino, Wolf Engelbach, Kuldar Taveter, Steven Frysinger
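
As a flavour of the underlying method, the sketch below ranks impact scenarios by a simple weighted sum of normalised indicators. Real MCDA methods used in such tools are considerably richer; the indicator names and weights are invented.

```python
# Minimal weighted-sum sketch of multi-criteria scenario ranking.
# Indicators are assumed normalised to [0, 1]; names/weights are invented.
def rank_scenarios(scenarios, weights):
    """Score each scenario as the weighted sum of its indicators."""
    scores = {
        name: sum(weights[c] * value for c, value in indicators.items())
        for name, indicators in scenarios.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1])  # lower impact first

scenarios = {
    "reinforce-dikes": {"casualties": 0.2, "cost": 0.9, "damage": 0.3},
    "evacuate-early":  {"casualties": 0.1, "cost": 0.5, "damage": 0.6},
}
weights = {"casualties": 0.6, "cost": 0.1, "damage": 0.3}
print(rank_scenarios(scenarios, weights))
```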

A Cotton Irrigator’s Decision Support System and Benchmarking Tool Using National, Regional and Local Data

We are developing a smart phone application that provides irrigation water management advice using satellite imagery, weather stations and field-scale farmer provided data.

To provide tailored advice we use high-resolution satellite imagery with national coverage, provided by Google Earth Engine services, to estimate field-specific crop growth information (crop coefficients); we are among the first systems to do so. These coefficients, combined with regional-scale weather station data for major cotton growing regions and farmer-supplied data, mean we can run daily water balance calculations for every individual cotton field in Australia and provide irrigation decision support advice.

We are using automated data processing to ensure the latest satellite and weather data is used for advice without manual effort.

We will also deliver benchmarking data to farmers based on their previous seasons as well as peers’ farms in order to compare absolute (calculated) and relative (benchmarked) advice.

Jamie Vleeshouwer, Nicholas J. Car, John Hornbuckle
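
The daily water balance at the core of such advice can be sketched as follows, assuming the standard FAO-56 relation ETc = Kc × ET0. The variable values and the refill threshold are illustrative, not the system’s parameters.

```python
# Sketch of a daily root-zone water balance (larger deficit = drier soil).
# Assumes the standard FAO-56 form ETc = Kc * ET0; numbers are illustrative.
def update_soil_water(deficit_mm, kc, et0_mm, rain_mm, irrigation_mm):
    """Advance the soil water deficit by one day."""
    etc = kc * et0_mm                       # crop evapotranspiration
    deficit = deficit_mm + etc - rain_mm - irrigation_mm
    return max(deficit, 0.0)                # cannot be wetter than field capacity

deficit = 20.0                              # mm, carried over from yesterday
deficit = update_soil_water(deficit, kc=1.1, et0_mm=7.5,
                            rain_mm=0.0, irrigation_mm=0.0)
if deficit > 50.0:                          # illustrative refill point
    print(f"Irrigation advised: deficit {deficit:.1f} mm")
else:
    print(f"No irrigation needed: deficit {deficit:.1f} mm")
```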

Water Pollution Reduction: Reverse Combinatorial Auctions Modelling Supporting Decision-Making Processes

This paper presents a model that contributes to finding cost-effective solutions when making decisions about building wastewater treatment plants in the planning process defined in the Framework Directive 2000/60/EC of the European Parliament and of the European Council. The model is useful especially when construction and operation of joint wastewater treatment plants is possible for several (neighbouring) municipalities, where a huge number of theoretical coalitions is possible. The paper presents the model principles for one pollutant and for multiple pollutants, describes the CRAB software used for computing the optimal solutions and presents selected applications. It concludes that the computations can contribute directly to decision-making concerning environmental protection projects and also serve for calculating background models for economic laboratory experiments in the area.

Petr Šauer, Petr Fiala, Antonín Dvořák

Scenario Planning Case Studies Using Open Government Data

The opportunity for improved decision making has been enhanced in recent years through the public availability of a wide variety of information. In Australia, government data is routinely made available and maintained in the http://data.gov.au repository. This is a single point of reference for data that can be reused for purposes beyond that originally considered by the data custodians. Similarly a wealth of citizen information is available from the Australian Bureau of Statistics. Combining this data allows informed decisions to be made through planning scenarios.

We present two case studies that demonstrate the utility of data integration and web mapping. As a simple proof of concept the user can explore different scenarios in each case study by indicating the relative weightings to be used for the decision making process. Both case studies are demonstrated as a publicly available interactive map-based website.

Robert Power, Bella Robinson, Lachlan Rudd, Andrew Reeson

Training Support for Crisis Managers with Elements of Serious Gaming

This paper presents a methodology and a prototypic software implementation of a simple system supporting resource management training for crisis managers. The application that is presented supports the execution and assessment of a desktop training for decision makers on a tactical and strategic level. It introduces elements of turn-based strategic “serious gaming”, with a possibility to roll back in time and re-try new decision paths, while keeping the graphical user interface as simple as possible. Consequently, the development efforts concentrated on: (1) formulating and executing crisis management decisions; (2) assuring responses of all simulated entities adhere to natural laws of the real world; and (3) analyzing progress and final results of the training exercise. The paper presents the lessons learned and discusses the transferability and extensibility of the proposed solution beyond the initial scenario involving accidental release of toxic gas in an urban area in Israel.

Denis Havlik, Oren Deri, Kalev Rannat, Manuel Warum, Chaim Rafalowski, Kuldar Taveter, Peter Kutschera, Merik Meriste

A Software System for the Discovery of Situations Involving Drivers in Storms

We present an environmental software system that obtains, integrates, and reasons over situational knowledge about natural phenomena and human activity. We focus on storms and driver directions. Radar data for rainfall intensity and Google Directions are used to extract situational knowledge about storms and driver locations along directions, respectively. Situational knowledge about the environment and about human activity is integrated in order to infer situations in which drivers are potentially at higher risk. Awareness of such situations is of obvious interest. We present a prototype user interface that supports adding scheduled driver directions and the visualization of situations in space-time, in particular also those in which drivers are potentially at higher risk. We think that the system supports the claim that the concept of situation is useful for the modelling of information about the environment, including human activity, obtained in environmental monitoring systems. Furthermore, the presented work shows that situational knowledge, represented by heterogeneous systems that share the concept of situation, is relatively straightforward to integrate.

Markus Stocker, Okko Kauhanen, Mikko Hiirsalmi, Janne Saarela, Pekka Rossi, Mauno Rönkkö, Harri Hytönen, Ville Kotovirta, Mikko Kolehmainen

Modelling and Simulation Systems

An Application Framework for the Rapid Deployment of Ocean Models in Support of Emergency Services: Application to the MH370 Search

Ocean models are beneficial to many different applications, including industry, public-good and defence applications. Many applications use high-resolution models to produce detailed maps of the ocean circulation. High-resolution models are historically time-consuming to configure, often taking weeks to months to properly prepare. A system for automatically configuring and executing a model to predict the past, present or future state of the ocean, named TRIKE, has been developed. TRIKE includes a sophisticated user interface allowing a user to easily set up a model in minutes, control data management and execute the model run on a supercomputer almost instantly. TRIKE makes it feasible to configure and execute high-resolution regional models anywhere in the world at a moment’s notice. TRIKE was used to support the recent search for MH370, which is thought to have crashed somewhere in the South Indian Ocean.

Uwe Rosebrock, Peter R. Oke, Gary Carroll

Exposure Modeling of Traffic and Wood Combustion Emissions in Northern Sweden

Application of the Airviro Air Quality Management System

Traffic and residential wood combustion (rwc) constitute the two dominant local sources of fine particulate matter (PM2.5) concentrations in Sweden. In order to meet the authorities’ requirements for air quality assessments, a national modelling system, SIMAIR, has been developed. The system is based on the commercial Airviro air quality management software, a three-tier client/server/web system which includes modules for measurement data collection, emission databases and dispersion models, with very high performance in terms of data access and model execution. The technical characteristics of the Airviro databases and models have facilitated web-based national air quality systems, of which some examples are given.

The present Airviro/SIMAIR application had the objective of assessing the impact of rwc in three urbanized areas in northern Sweden. The Airviro Scenario module was used to determine the exposure and health impact of the rwc contribution. The estimated mortality due to PM2.5 concentrations from residential wood combustion is about 4 persons/year, which corresponds to approximately 0.4% of the total number of deaths (excluding accidents). Cities with well-established district heating facilities have lower rwc use and a very different exposure to locally generated PM2.5; Umeå, one of the three areas in the study, is such a city. A similar assessment with impact only from traffic emissions shows an increase of 4.4 deaths for the Umeå population, while the impact of wood combustion in the city contributes 2.5 deaths per year. The advantages of using the Airviro software in combination with annually updated national-scale input databases are summarized. The approach makes it possible for municipal end users and non-meteorological professionals, such as epidemiologists, to perform advanced dispersion simulations and health impact assessments themselves.

Lars Gidhagen, Cecilia Bennet, David Segersson, Gunnar Omstedt

Medium-Term Analysis of Agroecosystem Sustainability under Different Land Use Practices by Means of Dynamic Crop Simulation

The role of dynamic crop models as an intellectual core of computer decision support systems in agricultural management has increased significantly in recent years. However, the scope of model applications is often limited to short time scales, i.e. crop simulation/forecasting is performed within a particular vegetation season. The use of dynamic models in long-term planning is still much less developed. This contribution presents the authors’ efforts in the development and improvement of the integrated crop simulation system «APEX-AGROTOOL» for use as a tool for model-oriented analysis of the environmental sustainability of land use. Attention is paid to the modification of the existing software to provide the ability to simulate agro-landscape dynamics taking crop rotation effects into account.

Sergey Medvedev, Alex Topaj, Vladimir Badenko, Vitalij Terleev

SPARK – A Bushfire Spread Prediction Tool

Bushfires are complex processes, making it difficult to accurately predict their rate of spread. We present an integrated software system for bushfire spread prediction, SPARK, which was developed with the functionality to model some of these complexities. SPARK uses a level set method governed by a user-defined algebraic spread rate to model fire propagation. The model is run within a modular workflow-based software environment. Implementation of SPARK is demonstrated for two cases: a small-scale experimental fire and a complex bushfire scenario. In the second case, the complexity of environmental non-homogeneity is explored through the inclusion of local variation in fuel and wind. Simulations over multiple runs with this fuel and wind variation are aggregated to produce a probability map of fire spread over a given time period. The model output has potential to be used operationally for real-time fire spread modeling, or by decision makers predicting risk from bushfire events.

Claire Miller, James Hilton, Andrew Sullivan, Mahesh Prakash
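
The level set approach can be illustrated with a toy propagation step: the fire perimeter is the zero contour of a scalar field phi advanced at a spatially varying spread rate R, i.e. phi_t + R |grad phi| = 0. This first-order sketch omits the reinitialisation and the empirical spread-rate models a production system such as SPARK would use.

```python
# Toy level-set step: the fire perimeter is the zero contour of phi,
# advanced at a user-defined spread rate R(x, y). Illustration only.
import numpy as np

def level_set_step(phi, rate, dx, dt):
    """Advance phi by one step of phi_t + R * |grad phi| = 0."""
    gy, gx = np.gradient(phi, dx)
    grad_mag = np.hypot(gx, gy)
    return phi - dt * rate * grad_mag

n, dx = 101, 1.0
y, x = np.mgrid[0:n, 0:n] * dx
phi = np.hypot(x - 50, y - 50) - 5.0        # circular ignition, radius 5
rate = np.where(x > 50, 2.0, 1.0)           # faster-burning fuel on one side

for _ in range(20):
    phi = level_set_step(phi, rate, dx, dt=0.2)
burnt_area = (phi <= 0).sum() * dx**2
print(f"burnt area after 4 time units: {burnt_area:.0f} m^2")
```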

Construction of a Bio-economic Model to Estimate the Feasibility and Cost of Achieving Water Quality Targets in the Burnett-Mary Region, Queensland

The aim of this study was to develop a bio-economic model to estimate the feasibility and net profit (or net cost) of achieving set water quality targets (sediment, nitrogen and phosphorus load reductions) in the Burnett Mary region of northern Queensland, Australia, with the aim of protecting the southern portion of the Great Barrier Reef (GBR). Two sets of targets were evaluated: Reef Plan Targets (RPTs), the currently formally agreed targets, and more ambitious Ecologically Relevant Targets (ERTs), which current science suggests might be needed to better protect the values of the GBR. This paper describes the construction of a bio-economic optimisation framework which has been used to underpin a Water Quality Improvement Plan (WQIP) for the Burnett Mary region. The bio-economic model incorporates the available science developed from paddock- and catchment-scale biophysical model results and farm economic analysis. The model enabled transparent assessment and optimisation of the net profits and costs associated with four categories of best management practices (cutting-edge unproven technologies, called ‘A’ practices; current best-management practices, called ‘B’; common industry, or ‘C’, practices; and below industry standards, or ‘D’, practices) in the grazing and sugar cane industries. The bio-economic model was able to solve for RPTs or ERTs assigned either to the entire region or within each of five discrete river basins. Key outcomes from the study were that RPTs could be achieved at an annual cost of $3M/year on a whole-of-region basis. In contrast, ERTs could be achieved on a whole-of-region basis at a net cost of $16M/year. ERTs could not feasibly be met on a basin-by-basin basis. This is the first time such a comprehensive and integrated bio-economic model has been constructed for a region within the GBR using environmental software that linked the available biophysical and economic modelling.

Craig Beverly, Anna Roberts, Geoff Park, Fred Bennett, Graeme Doole

Integrating Hydrodynamic and Hydraulic Modeling for Evaluating Future Flood Mitigation in Urban Environments

We present an integrated flood modelling tool that is able to evaluate different mitigation solutions for areas that are prone to floods from storm surge and heavy rainfall. Our model integrates catchment and coastal flood modelling (spatio-temporally dynamic), including sea level rise, to provide a holistic inundation model for future flooding. Additionally, the model aims to enable simulation of a combination of flood mitigation and adaptation options. To date, the practice has been to model either drainage augmentation solutions alone or (for coastal inundation) single coastal adaptation solutions. This tool aims to deliver the capacity to model a range of both coastal and drainage adaptation solutions, to understand what combination of solutions might be effective. The model is demonstrated for example cases in the City of Port Phillip, Melbourne. Three mitigation strategies, involving the use of a hypothetical off-shore reef and the combination of single valve systems and retention/detention measures, are evaluated for the region around Elwood canal under current and future scenarios.

Mahesh Prakash, James Hilton, Lalitha Ramachandran

Modelling of Air Flow Analysis for Residential Homes Using Particle Image Velocimetry

The purpose of this paper is to simulate the designed physical Particle Image Velocimetry (PIV) system, as a simulation platform from which to physically build and implement a system for residential building and housing research. The focus is on the angle filter: the filter used to process the images of a laser progressively scanning through a space by adjusting its angle.

An indicator of heat escape and ventilation in buildings is the airflow itself. Conventional airflow measurement techniques are typically intrusive, interfering with the data or the environment. For small flows, such as those in residential housing, the error introduced can sometimes be large relative to the measured data. In contrast, PIV is a relatively non-intrusive flow measurement technique. However, there are a few problems with implementing standard PIV techniques in an attic space.

The proposed solution is to use dust particles, already present in the air, as tracers for the PIV system. In conclusion, our PIV system with a non-diverging laser beam produces a velocity field of similar quality to that of a standard PIV system.

Rajiv Pratap, Ramesh Rayudu, Manfred Plagmann
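
The core PIV computation, recovering the displacement between two interrogation windows from the peak of their cross-correlation, can be sketched as below. Production PIV adds windowing, sub-pixel peak fitting and outlier filtering.

```python
# Sketch of the core PIV step: displacement between two interrogation
# windows is the peak of their FFT-based cross-correlation.
import numpy as np

def piv_displacement(win_a, win_b):
    """Return (dy, dx) displacement of win_b relative to win_a."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.real(np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map FFT indices to signed shifts
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))  # known motion
print(piv_displacement(frame_a, frame_b))                # -> (3, -2)
```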

Open Data Sources for the Development of Mobile Applications and Forecast of Microbial Contamination in Bathing Waters

This paper describes a service-oriented architecture for mobile and web applications and the enablement of participatory observations of the environment. The architecture hosts generic microbial risk forecast models for bathing zones, which are trained on heterogeneous input data. Open observation data sources specializing in water quality indicators and environmental processes are used in the construction of such applications. Nevertheless, integrating the open data sources proved challenging due to various incompatibilities in the data samples, including gaps in the data, diverse temporal and spatial coverage, and conflicting collection policies.

Gianluca Correndo, Zoheir A. Sabeur

Ecohydrology Models without Borders?

Using Geospatial Web Services in EcohydroLib Workflows in the United States and Australia

Ecohydrology models require diverse geospatial input datasets (e.g. terrain, soils, vegetation species and leaf area index), the acquisition and preparation of which are labor intensive, yielding workflows that are difficult to reproduce. EcohydroLib is a software framework for managing spatial data acquisition and preparation workflows for ecohydrology modeling, while automatically capturing metadata and provenance information. The goal of EcohydroLib is to enable water scientists to spend less time acquiring and manipulating geospatial data and more time using ecohydrology models to test hypotheses, while making it easier for models to be shared and scientific results to be reproduced. This increased reproducibility, ease of sharing and researcher productivity can enable both model intercomparison within a country and site intercomparison across national borders. Currently, EcohydroLib allows modelers to work with geospatial data stored locally as well as high-resolution U.S. national spatial data available via web services, for example 30-meter digital elevation model and land cover data, and 1:12,000-scale soils data. While researchers working in watersheds outside the U.S. can use EcohydroLib, they must manually download data for their study areas before these data can be imported into EcohydroLib workflows. Though national agencies in the U.S. and Australia offer some datasets via web services, with a few exceptions these are either lower-resolution datasets or data made available via Open Geospatial Consortium (OGC) Web Map Service (WMS) interfaces, of use primarily for cartography, rather than via the OGC Web Coverage Service (WCS) or Web Feature Service (WFS) interfaces needed for integration with numerical models. In this paper we explore: (1) the availability of high-resolution national geospatial data web services in the United States and Australia; and (2) the integration of Australian web services with EcohydroLib.

Brian Miles, Lawrence E. Band
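
Fetching a terrain grid from an OGC WCS endpoint, the kind of access EcohydroLib automates, might look like the following with the OWSLib library. The endpoint URL, coverage name and bounding box are placeholders, and the exact call signature may vary between OWSLib versions.

```python
# Hedged sketch of WCS access with OWSLib. Endpoint, coverage name and
# bounding box are placeholders; signature follows OWSLib's WCS 1.0.0 API.
from owslib.wcs import WebCoverageService

wcs = WebCoverageService("http://example.org/wcs", version="1.0.0")
print(list(wcs.contents))                    # coverages offered by the server

response = wcs.getCoverage(
    identifier="dem_30m",                    # placeholder coverage name
    bbox=(149.0, -36.0, 150.0, -35.0),       # lon/lat of the study watershed
    crs="EPSG:4326",
    format="GeoTIFF",
    resx=0.0003, resy=0.0003)                # roughly 30 m at this latitude

with open("dem_30m.tif", "wb") as f:
    f.write(response.read())
```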

Architectures, Infrastructures, Platforms and Services

A Distributed Computing Workflow for Modelling Environmental Flows in Complex Terrain

Numerical modelling of extreme environmental flows such as flash floods, avalanches and mudflows can be used to understand fundamental processes, predict outcomes and assess the loss potential of future events. These extreme flows can produce complicated and dynamic free surfaces as a result of interactions with the terrain and built environment. In order to resolve these features that may affect flows, high resolution, accurate terrain models are required. However, terrain models can be difficult and costly to acquire, and often lack detail of important flow steering structures such as bridges or debris. To overcome these issues we have developed a photogrammetry workflow for reconstructing high spatial resolution three dimensional terrain models. The workflow utilises parallel and distributed computing to provide inexpensive terrain models that can then be used in numerical simulations of environmental flows. A section of Quebrada San Lazaro within the city of Arequipa, Peru is used as a case study to demonstrate the construction and usage of the terrain models and applicability of the workflow for a flash flood scenario.

Stuart R. Mead, Mahesh Prakash, Christina Magill, Matt Bolger, Jean-Claude Thouret

An Integrated Workflow Architecture for Natural Hazards, Analytics and Decision Support

We present a modular workflow platform for the simulation and analysis of natural hazards. The system is based on the CSIRO Workspace architecture, allowing transparent data interoperability between input, simulation and analysis components. The system is currently in use for flooding and inundation modelling, as well as research into bushfire propagation. The modularity of the architecture allows tools from multiple development groups to be easily combined into complex workflows for any given scenario. Examples of platform usage are demonstrated for saline intrusion into an environmentally sensitive area and image analysis inputs into a fire propagation model.

James Hilton, Claire Miller, Matt Bolger, Lachlan Hetherton, Mahesh Prakash

Quality Control of Environmental Measurement Data with Quality Flagging

We discuss quality control of environmental measurement data. Typically, environmental data is used to compute specific indicators based on models, historical data and the most recent measurements. For such a computation to produce reliable results, the data must be of sufficient quality. The reality, however, is that environmental measurement data varies hugely in quality. We therefore study the use of quality flagging as a means to perform both real-time and off-line quality control of environmental measurement data. We propose the adoption of the quality flagging scheme introduced by the Nordic meteorological institutes. As the main contribution, we present both a uniform interpretation of the quality flag values and a scalable Enterprise Service Bus based architecture for implementing quality flagging. We exemplify the use of quality flagging and the architecture with a case study on monitoring of the built environment.

Mauno Rönkkö, Okko Kauhanen, Markus Stocker, Harri Hytönen, Ville Kotovirta, Esko Juuso, Mikko Kolehmainen
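
Per-observation flagging can be sketched as below. The flag codes are modelled loosely on schemes such as the Nordic one the paper adopts, but the specific values and range checks are invented for illustration.

```python
# Illustrative per-observation quality flagging; the codes and range checks
# are invented, not the Nordic scheme's actual values.
from enum import IntEnum

class QualityFlag(IntEnum):
    OK = 0            # passed all checks
    SUSPICIOUS = 1    # outside expected range, kept for review
    ERRONEOUS = 2     # physically impossible, should be excluded
    MISSING = 9       # no value delivered

def flag_temperature(value):
    """Assign a quality flag to one air-temperature reading (deg C)."""
    if value is None:
        return QualityFlag.MISSING
    if not -80.0 <= value <= 60.0:      # beyond physical limits
        return QualityFlag.ERRONEOUS
    if not -40.0 <= value <= 45.0:      # plausible but unusual for the site
        return QualityFlag.SUSPICIOUS
    return QualityFlag.OK

readings = [21.3, 55.0, None, -12.0]
print([(v, flag_temperature(v).name) for v in readings])
```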

Towards a Search Driven System Architecture for Environmental Information Portals

In order to merge data from different information systems in web portals, querying this data has to be simple and performant. If no direct, high-performance query services are available, data access can be provided (and often accelerated) using external search indexes, an approach well proven for unstructured data by classical full-text search engines. This article describes how structured data can also be provided through search engines, and how this data can then be re-used by other applications, e.g. mobile apps or business applications, incidentally reducing their complexity and the number of required interfaces. Users of environmental portals and applications can benefit from an integrated view of unstructured as well as structured data.

Thorsten Schlachter, Clemens Düpmeier, Oliver Kusche, Christian Schmitt, Wolfgang Schillinger
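
The core idea of serving structured records from a search index can be sketched as below: each field is indexed as a "field:value" token so facet-style queries work alongside free text. Real deployments would use a search server; this in-memory index is purely illustrative.

```python
# Toy inverted index over structured records, using field-prefixed tokens.
# Stand-in for a real search engine; for illustration only.
from collections import defaultdict

index = defaultdict(set)          # token -> set of record ids
records = {
    1: {"type": "measurement", "station": "Rhine-01", "param": "NO2"},
    2: {"type": "station", "station": "Rhine-01", "state": "BW"},
}

for rid, rec in records.items():
    for field, value in rec.items():
        index[f"{field}:{value}".lower()].add(rid)

def query(*tokens):
    """AND-query over field:value tokens, e.g. query('type:measurement')."""
    sets = [index[t.lower()] for t in tokens]
    return set.intersection(*sets) if sets else set()

print(query("station:Rhine-01", "type:measurement"))   # -> {1}
```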

National Environmental Data Facilities and Services of the Czech Republic and Their Use in Environmental Economics

National environmental data facilities and services are part of the environmental information systems of the Ministry of the Environment of the Czech Republic that have been under development since 1990. In 2010 the development of the National Information System for Collecting and Evaluating Information on Environmental Pollution project started, co-financed by the European Regional Development Fund. This project consists of an integrated system of reporting (ISPOP), an environmental help desk (EnviHELP), and the national INSPIRE geoportal, which were developed between 2010 and 2013 and were discussed at ISESS 2013. This paper introduces the current development of several national environmental and financial data facilities and services based on eGovernment implementation in the Czech Republic and the open environmental and financial data approach of the Czech Ministry of the Environment and the Czech Ministry of Finance. It also introduces the web information system that enabled us to find the relationship between environmental economics and municipal waste management in the Czech Republic.

Jana Soukopová, Jiří Hřebíček, Jiří Valta

A Best of Both Worlds Approach to Complex, Efficient, Time Series Data Delivery

Point time series are a key data type for describing real or modelled environmental phenomena. Delivering this data in useful ways can be challenging when the data volume is large, when computational work (such as aggregation, subsetting or re-sampling) needs to be performed, or when complex metadata is needed to place data in context for understanding. Some aspects of these problems are especially relevant to the environmental domain: large sensor networks measuring continuous environmental phenomena, sampling frequently over long periods of time, generate very large datasets, and rich metadata is often required to understand the context of observations. Nevertheless, time series data, and most of these challenges, are prevalent beyond the environmental domain, for example in the financial and industrial domains.

A review of recent technologies illustrates an emerging trend toward high performance, lightweight, databases specialized for time series data. These databases tend to have non-existent or minimalistic formal metadata capacities. In contrast, the environmental domain boasts standards such as the Sensor Observation Service (SOS) that have mature and comprehensive metadata models but existing implementations have had problems with slow performance.

In this paper we describe our hybrid approach to achieving efficient delivery of large time series datasets with complex metadata. We use three subsystems within a single system-of-systems: a proxy (Python), an efficient time series database (InfluxDB) and a SOS implementation (52 North SOS). Together these present a regular SOS interface. The proxy processes standard SOS queries and issues them either to the 52 North SOS or to InfluxDB for processing. Responses are returned directly from the 52 North SOS, or indirectly from InfluxDB via the Python proxy, where they are processed into WaterML. This enables the scalability and performance advantages of the time series database to be married with the sophisticated metadata handling of SOS. Testing indicates that a recent version of 52 North SOS configured with a PostgreSQL/PostGIS database performs well, but an implementation incorporating InfluxDB and 52 North SOS in a hybrid architecture performs approximately 12 times faster.

Benjamin Leighton, Simon J. D. Cox, Nicholas J. Car, Matthew P. Stenson, Jamie Vleeshouwer, Jonathan Hodge
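
The heart of such a proxy is a routing decision: data-volume-dominated requests go to the time series store, metadata-rich requests to the full SOS. A minimal sketch, with operation sets and backend names that are our assumptions rather than the paper’s implementation:

```python
# Sketch of the proxy's routing decision; operation sets and backend labels
# are assumptions, not the paper's actual implementation.
TIMESERIES_HEAVY = {"GetObservation"}        # data-volume-dominated requests
METADATA_RICH = {"GetCapabilities", "DescribeSensor", "GetFeatureOfInterest"}

def route(sos_request):
    """Decide which backend should answer a parsed SOS request."""
    op = sos_request.get("request")
    if op in TIMESERIES_HEAVY:
        return "influxdb"    # fast path: query InfluxDB, re-encode as WaterML
    if op in METADATA_RICH:
        return "52north-sos" # metadata path: pass through unchanged
    return "52north-sos"     # default to the standards-complete backend

print(route({"request": "GetObservation", "offering": "river-temp"}))
print(route({"request": "DescribeSensor", "procedure": "gauge-42"}))
```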

Implementing a Glossary and Vocabulary Service in an Interdisciplinary Environmental Assessment for Decision Makers

When delivering scientific information for decision makers, it is important to define and use appropriate terminology to ensure scientific credibility and good communication. A glossary with terms from authoritative sources for specific domains can increase the usefulness and reusability of information for decision makers as the information can be more easily used without adaptation or translation. Linked Data principles and semantic web-based vocabulary tools provide mechanisms for delivering formalised glossaries via vocabulary services for use in integrated products, both documents and information platforms.

Issues to consider when implementing a glossary and vocabulary service are covered: persuading stakeholders to accept standard external terms and gain agreement on unique terminology; requirements for gathering, controlling and maintaining terminology in a glossary to ensure transparency and persistence; formalising a glossary as a standards-based vocabulary; and efficiently implementing this glossary via automation.

Simon N. Gallant, Rebecca K. Schmidt, Nicholas J. Car
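
Formalising a glossary entry as a SKOS concept takes only a few triples. The sketch below uses rdflib; the vocabulary URI and the example term are invented.

```python
# Sketch of one glossary entry as a SKOS concept. The vocabulary URI and
# term are invented for illustration.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

VOC = Namespace("http://example.org/voc/assessment/")  # placeholder

g = Graph()
g.bind("skos", SKOS)
term = VOC["water-table"]
g.add((term, RDF.type, SKOS.Concept))
g.add((term, SKOS.prefLabel, Literal("water table", lang="en")))
g.add((term, SKOS.definition,
       Literal("The upper surface of the saturated zone of an aquifer.",
               lang="en")))
g.add((term, SKOS.altLabel, Literal("groundwater table", lang="en")))

print(g.serialize(format="turtle"))
```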

Requirements, Software Engineering and Software Tools

Requirement Engineering for Emergency Simulations

This paper deals with the requirements engineering used in the first phase of the SIMEX project (Research and development of simulation tools for training the cooperation of actors in emergency management by subjects of critical infrastructure).

Project SIMEX focuses on the development of simulation tools and instruments for common interoperability training of crisis staff managers, security advisory bodies and liaison safety employees in the energy sector, together with the integrated rescue system, in dealing with emergencies and their consequences. This paper also analyses different national systems providing publicly available information and evaluates the usability and benefits of incorporating the information they generate into the simulation tool.

Alena Oulehlová, Jiří Barta, Hana Malachová, Jiří F. Urbánek

Requirements Engineering for Semantic Sensors in Crisis and Disaster Management

This paper describes the requirements engineering methodology used for the definition of semantic sensors in a Crisis and Disaster Management framework. The goal of the framework is the effective management of emergencies, which depends on timely information availability, reliability and intelligibility. To achieve this, different Command and Control (C2) Systems and Sensor Systems have to cooperate and interoperate. Unless standards and well-defined specifications are used, however, achieving interoperability between these systems can be very complex. To address this challenge, in the C2-SENSE project a “profiling” approach will be used to achieve seamless interoperability by addressing all the layers of the communication stack in the security field. The main objective is to develop a profile-based Emergency Interoperability Framework using existing standards and semantically enriched Web services to expose the functionalities of C2 Systems, Sensor Systems and other Emergency and Crisis Management systems. We introduce the concept of Semantic Sensors, describe the characteristics of Sensor Systems in Emergency Management, and present the requirements engineering methodology for such a framework.

Bojan Božić, Mert Gençtürk, Refiz Duro, Yildiray Kabak, Gerald Schimak

Context Ontology Modelling for Improving Situation Awareness and Crowd Evacuation from Confined Spaces

Crowd evacuation management at large venues such as airports, stadiums, cruise ships or metro stations requires the deployment of, and access to, a Common Operational Picture (COP) of the venue, with real-time intelligent contextual interpretation of crowd behaviour. Large CCTV and sensor network feeds provide important but heterogeneous observations about crowd safety at the venue of interest. These observations must therefore be critically analyzed and interpreted to support the managers responsible for crowd safety at venues. Specifically, the large volume of generated observations needs to be interpreted in the context of the venue’s operational grounds, crowd-gathering event times and knowledge of expected crowd behaviour. In this paper, a new context ontology modelling approach is introduced. It is based on knowledge about venue background information, expected crowd behaviours and their manifested features of observations. The aim is to improve situation awareness about crowd safety in crisis management and decision support.

Gianluca Correndo, Banafshe Arbab-Zavar, Zlatko Zlatev, Zoheir A. Sabeur

Reconstructing the Carbon Dioxide Absorption Patterns of World Oceans Using a Feed-Forward Neural Network: Software Implementation and Employment Techniques

Oceans play a major role in the global carbon budget, absorbing approximately 27% of anthropogenic carbon dioxide (CO2). As the degree to which an ocean can serve as a carbon sink is determined by the partial pressure of CO2 in the surface water, it is critical to obtain an accurate estimate of the spatial distributions of CO2 and its temporal variation on a global scale. However, this is extremely challenging due to insufficient measurements, large seasonal variability, and short spatial de-correlation scales. This paper presents an open source software package that implements a feed-forward neural network and a back-propagation training algorithm to solve a problem with one output variable and a large number of training patterns. We discuss the employment of the neural network for global ocean CO2 mapping.

Jiye Zeng, Hideaki Nakajima, Yukihiro Nojiri, Shin-ichiro Nakaoka
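
The network described, a feed-forward net with one continuous output trained by back-propagation, can be reduced to a few lines of NumPy. Layer sizes, learning rate and the synthetic data below are illustrative only; the actual package supports large training-pattern sets and more options.

```python
# Minimal one-hidden-layer feed-forward network with back-propagation for a
# single continuous output. Sizes, rate and data are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden, lr = 3, 8, 0.05
W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)

X = rng.random((200, n_in))                 # e.g. SST, salinity, chlorophyll
y = X @ np.array([[2.0], [-1.0], [0.5]]) + 0.1 * rng.normal(size=(200, 1))

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)                # hidden activations
    out = h @ W2 + b2                       # linear output
    err = out - y                           # gradient of MSE w.r.t. output
    dW2 = h.T @ err / len(X)
    dh = (err @ W2.T) * (1 - h**2)          # backprop through tanh
    dW1 = X.T @ dh / len(X)
    W2 -= lr * dW2; b2 -= lr * err.mean(0)
    W1 -= lr * dW1; b1 -= lr * dh.mean(0)

print("final MSE:", float((err**2).mean()))
```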

Three Levels of R Language Involvement in Global Monitoring Plan Warehouse Architecture

Three different options for involving R statistical software in the infrastructure of the data warehouse and visualization tool of the Global Monitoring Plan for persistent organic pollutants are presented, all differing in their demands with respect to data transfer rates, numbers of concurrently connected users, total amounts of data transferred, and the possibilities of repeating statistical calculations within a short period. After the development stage, two of these options were used at different levels of the system, demonstrating the specificity of their use and enabling the deployment of the powerful features of R statistical software by a system created using conventional programming languages.

Jiří Kalina, Richard Hůlek, Jana Borůvkova, Jiří Jarkovský, Jana Klánová, Ladislav Dušek

Process Design Patterns in Emergency Management

Emergency management is the discipline of dealing with and avoiding risks. In any emergency, immediate and fast intervention is necessary; this is possible only with well-prepared contingency plans and adequate software support. The aim of this paper is therefore to focus on the common characteristics of process modelling within emergency management and to define the design patterns typical of this area. These design patterns enable faster and simpler generation of emergency processes and contingency plans, as well as subsequent software support. They are derived from current documentation and legislation. A design pattern does not represent a final emergency process, but only a certain structure which must be further customized according to current user requirements. Specifically, 16 design patterns have been identified and described across several management levels. The present form of the design patterns is the result of consolidating many real emergency management processes that arose and were verified within several research projects, and of a set of directed interviews with emergency management experts.

Tomáš Ludík, Tomáš Pitner

Analytics and Visualization

Advanced Data Analytics and Visualisation for the Management of Human Perception of Safety and Security in Urban Spaces

The genesis of this work was the DESURBS project, whose scope was to help build a collaborative decision-support system portal where spatial planning professionals could learn about designing much more secure and safer spaces in urban areas. The portal achieved this by integrating a number of tools under a common, simple-to-use interface. However, deficiencies in the project became apparent with subsequent development: many of the open data sources employed changed format, while applications were increasingly custom built for a single dataset. To overcome this, a system called KnowDS was redesigned. The essence of the new design is the decoupling of the acquisition, analysis and presentation data components. The acquisition component was designed to snapshot the “data providing methods” and query data provenance in a similar way to a source code repository. The analysis component is built from a number of modular tools with a common interface, which allows analyses to be assembled in a plug-and-play fashion. Finally, the data presentation component is where the custom logic resides. Under such a design approach, building future applications becomes less challenging. Two case studies using the new framework were considered. The first is a UK crime web browser which supports data analytics at various granularities of crime type while correlating crimes across various UK cities. The second is a mobile application which enables citizens to generate reports on their perception of safety in urban zones. The two applications were built efficiently under the new design framework; they clearly demonstrate the capacity of the new system while actively generating new knowledge about safety in urban spaces.

Panos Melas, Gianluca Correndo, Lee Middleton, Zoheir A. Sabeur
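
A minimal sketch of the plug-and-play analysis idea: modules implement a common interface and register themselves, so new analytics can be added without touching acquisition or presentation. All names here (AnalysisModule, CrimeRateByCity) are invented for illustration and are not KnowDS APIs:

```python
from typing import Protocol

class AnalysisModule(Protocol):
    name: str
    def run(self, records: list[dict]) -> dict: ...

REGISTRY: dict[str, AnalysisModule] = {}

def register(module: AnalysisModule) -> None:
    REGISTRY[module.name] = module

class CrimeRateByCity:
    name = "crime_rate_by_city"
    def run(self, records: list[dict]) -> dict:
        counts: dict[str, int] = {}
        for r in records:
            counts[r["city"]] = counts.get(r["city"], 0) + 1
        return counts

register(CrimeRateByCity())
print(REGISTRY["crime_rate_by_city"].run([{"city": "Leeds"}, {"city": "Leeds"}]))
```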

Combined Aggregation and Column Generation for Land-Use Trade-Off Optimisation

In this paper we combine an aggregation-disaggregation technique with column generation to solve a large-scale LP problem originating in land use management in the Australian agricultural sector. The problem is to optimally allocate the most profitable land use activities, including agriculture, carbon sequestration, environmental planting, bio-fuel and bio-energy, subject to food demand requirements and expansion policies for each year from 2013 to 2050. We produce a high-resolution solution by dividing Australia's agricultural areas into square-kilometre cells, which leads to more than thirteen million cells to be assigned, totally or partially, to different activities. Given a scenario for agricultural returns, carbon-related activities, future energy prices, water availability and global climate change, a linear programming problem is composed for each year. Even with a state-of-the-art commercial LP solver, finding an optimal solution for a single year takes a long time, so incorporating several scenarios simultaneously is practically impossible, as the corresponding model becomes even larger. Exploiting properties of the problem, such as the similar economic and geographical characteristics of nearby land parcels, we combine clustering ideas with column generation to decompose the large problem into smaller sub-problems, yielding a computationally efficient algorithm for the large-scale problem.

Asef Nazari, Andreas Ernst, Simon Dunstall, Brett Bryan, Jeff Connor, Martin Nolan, Florian Stock
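
A schematic column-generation loop illustrating the decomposition, not the paper's land-use formulation: a restricted master LP is solved over a few columns, its dual prices drive a pricing step that proposes a column with negative reduced cost, and the loop stops when no improving column remains. The data and the candidate pool are toy placeholders for the clustered sub-problems:

```python
import numpy as np
from scipy.optimize import linprog

demands = np.array([4.0, 3.0])                     # two demand rows (toy)

# Start with expensive slack-like columns so the master is feasible.
columns = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
costs = [10.0, 10.0]

# Candidate pool standing in for the pricing sub-problems.
pool = [(np.array([1.0, 1.0]), 7.0), (np.array([2.0, 1.0]), 12.0)]

while True:
    res = linprog(costs, A_eq=np.column_stack(columns), b_eq=demands,
                  bounds=[(0, None)] * len(columns), method="highs")
    duals = res.eqlin.marginals                    # prices of the demand rows
    if not pool:
        break
    # Pricing: candidate with the most negative reduced cost c_j - y.a_j.
    idx, rc = min(((i, c - duals @ a) for i, (a, c) in enumerate(pool)),
                  key=lambda t: t[1])
    if rc >= -1e-9:
        break                                      # optimal: no improving column
    a, c = pool.pop(idx)
    columns.append(a)
    costs.append(c)

print(res.fun, res.x)
```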

A Software Package for Automated Partitioning of Catchments

This paper reports on a software package developed to automatically partition hydrological networks (catchments) into clusters of similar size. Such clustering is useful for parallel simulation of catchments on distributed computing systems and is typically based on heuristic graph algorithms.

There have been a few approaches to automatic catchment partitioning, but a literature review indicates that there has been no systematic investigation of the usefulness of different graph algorithms for this task across a reasonable number of real-world data sets. Our study aims to make a step in this direction.

The paper describes the software package, which has been implemented in Java, its pluggable architecture, and initial experiments using the European catchment dataset ECRINS. The paper presents work in progress.

Ralf Denzer, Tobias Kalmes, Udo Gauer
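
Since the package itself is written in Java and its algorithms are not listed in the abstract, the following stand-in sketch shows the kind of heuristic graph partitioning involved, using Kernighan-Lin bisection from networkx on a toy river network:

```python
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

# Toy river network: nodes are sub-catchments, edges follow flow links.
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (3, 7), (7, 8)])

# Split into two similarly sized clusters while keeping cut edges few.
part_a, part_b = kernighan_lin_bisection(G, seed=42)
print(sorted(part_a), sorted(part_b))
```

A pluggable architecture in the spirit of the paper would let such algorithms be swapped behind a common partitioner interface and compared over many real catchment graphs.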

Understanding Connectivity between Groundwater Chemistry Data and Geological Stratigraphy via 3D Sub-surface Visualization and Analysis

This paper describes the 3D Water Chemistry Atlas, an open source, Web-based system that enables three-dimensional (3D) sub-surface visualization of groundwater monitoring data, overlaid on the local geological model. Following a review of existing technologies, the system adopts Cesium (an open source Web-based 3D mapping and visualization interface) together with a PostgreSQL/PostGIS database for its technical architecture. In addition, a range of search, filtering, browsing and analysis tools was developed that enables users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks, and make more informed decisions about activities such as coal seam gas extraction and waste water extraction and re-use.

Jane Hunter, Andre Gebers, Lucy Reading, Sue Vink
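
A minimal sketch of the database side of such an architecture: pulling 3D sample points from PostGIS as GeoJSON features that a Cesium client could render. The table and column names are hypothetical, not the Atlas schema:

```python
import json
import psycopg2

SQL = """
SELECT ST_AsGeoJSON(ST_Force3D(geom)), analyte, value_mg_l, sampled_at
FROM groundwater_samples
WHERE analyte = %s AND sampled_at BETWEEN %s AND %s
"""

def fetch_features(analyte, start, end):
    """Return groundwater samples as GeoJSON features for the 3D client."""
    with psycopg2.connect("dbname=atlas") as conn, conn.cursor() as cur:
        cur.execute(SQL, (analyte, start, end))
        return [{"type": "Feature",
                 "geometry": json.loads(geom),
                 "properties": {"analyte": a, "value_mg_l": v,
                                "sampled_at": str(t)}}
                for geom, a, v, t in cur.fetchall()]
```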

Distributed Minimum Temperature Prediction Using Mixtures of Gaussian Processes

Agricultural producers require minimum temperature predictions in order to assess the magnitude of potential frost events. Several regression models can be used for the estimation problem at a single location, but a common difficulty is the amount of data required for training, testing and validation. Sensor networks can now gather environmental data from multiple locations. To reduce the amount of data needed to model a single site, we combine information from the different sources and then estimate the performance of the estimator using held-out test sites. A mixture of Gaussian Processes (MGP) model is proposed for the distributed estimation problem, together with an efficient Hybrid Monte Carlo approach for estimating the model parameters.

Sergio Hernández, Philip Sallis
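
A minimal sketch of the mixture structure, with one GP per training site and predictions blended by softmax weights. The paper estimates the mixture with Hybrid Monte Carlo; here maximum-likelihood GPs and fixed gating weights stand in purely to show the shape of the model:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
site_gps = []
for _ in range(3):                                   # one GP per sensor site
    X = rng.uniform(0, 24, (40, 1))                  # hour of day
    y = 4 * np.sin(2 * np.pi * X[:, 0] / 24) + rng.normal(0, 0.5, 40)
    site_gps.append(GaussianProcessRegressor(kernel=RBF(5.0)).fit(X, y))

def predict_min_temp(x, gate_logits):
    """Blend per-site GP means with softmax mixture weights."""
    w = np.exp(gate_logits - gate_logits.max())
    w /= w.sum()
    means = np.array([gp.predict(x)[0] for gp in site_gps])
    return float(w @ means)

print(predict_min_temp(np.array([[6.0]]), np.array([0.2, 0.5, 0.3])))
```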

A Framework for Optimal Assessment of Planning Investments in Urban Water Systems

The functionality that governments and citizens expect from urban water management systems (UWMS) has evolved over time from delivering basic services to addressing complex goals such as healthy ecosystems, environmental sustainability and economic growth. Alongside these changing expectations, the pressure on policy makers to meet disparate performance metrics with diminishing resources is increasing. In this paper, we introduce the Water Assets and Infrastructure Network Decision Support (WAND) framework for modelling and planning urban water systems, with the aim of assisting medium- to long-term investment decisions. The framework combines economic performance measures with liveability and sustainability, and uses optimisation as a crucial component. To demonstrate its feasibility, we present a prototype environmental software system built on the principles of legibility, adherence to engineering conventions, and extensive unit testing. We use the prototype to analyse a model of a hypothetical urban water system formed by two coupled water networks, one for freshwater and one for stormwater collection, handling six different commodities. Our results suggest to planners an optimal combination of planning investments while considering capacities, service levels and network operating conditions.

Rodolfo García-Flores, Magnus Moglia, David Marlow
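
A minimal sketch of the optimisation core such a framework needs: a least-cost flow over a tiny water network subject to arc capacities and a demand node, solved as an LP. The network, costs and demand are invented, and WAND's liveability and sustainability measures are omitted:

```python
import numpy as np
from scipy.optimize import linprog

# Arcs: s->a, s->b, a->t, b->t; node t demands 8 units.
cost = np.array([1.0, 2.0, 1.5, 1.0])      # cost per unit of flow per arc
cap = [(0, 5), (0, 6), (0, 5), (0, 6)]     # arc capacity bounds

# Flow conservation at intermediate nodes a and b, demand satisfied at t.
A_eq = np.array([
    [1, 0, -1, 0],   # node a: inflow equals outflow
    [0, 1, 0, -1],   # node b: inflow equals outflow
    [0, 0, 1, 1],    # node t: total inflow meets demand
])
b_eq = np.array([0, 0, 8])

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=cap)
print(res.x, res.fun)                      # optimal flows and total cost
```

Investment planning would then add decision variables for capacity upgrades and run such models over a multi-year horizon.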

Measuring and Benchmarking Corporate Environmental Performance

This paper discusses corporate environmental performance and proposes a framework for an environmental performance benchmarking model. Corporate environmental performance is measured by key performance indicators (KPIs): greenhouse gas emissions, water consumption, waste production and gross value added. Performance is benchmarked against the production frontier estimated by Data Envelopment Analysis. The environmental performance benchmarking model was created and tested on real corporate data. The model determines relative corporate environmental performance, identifies weaknesses in performance and quantifies performance gaps.

Marie Pavláková Dočekalová, Alena Kocmanová, Jana Hornungová
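
A minimal sketch of the DEA step, an input-oriented CCR model with the paper's KPI structure (three environmental inputs, gross value added as output) on invented data; an efficiency of 1.0 marks a company on the estimated frontier:

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[12, 8, 15, 10],      # greenhouse gas emissions (input)
              [30, 22, 40, 25],     # water consumption (input)
              [5,  4,  9,  6]])     # waste production (input)
Y = np.array([[100, 90, 110, 95]])  # gross value added (output)

def efficiency(k):
    """CCR efficiency of company k: variables are [theta, lambda_1..n]."""
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                     # minimise theta
    A_in = np.c_[-X[:, [k]], X]                     # sum(l*x) <= theta*x_k
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]    # sum(l*y) >= y_k
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, k]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    return res.fun

print([round(efficiency(k), 3) for k in range(X.shape[1])])
```

The performance gap for an inefficient company is then 1 minus its efficiency score, pointing to the proportional input reduction needed to reach the frontier.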

GeneralBlock: A C++ Program for Identifying and Analyzing Rock Blocks Formed by Finite-Sized Fractures

GeneralBlock is a software tool for identifying and analyzing rock blocks formed by finite-sized fractures. Developed in C++ with a user-friendly interface, it can analyze the blocks of a complex-shaped modeling domain, such as slopes, tunnels, underground caverns, or their combinations. The heterogeneity of materials is fully taken into account: both the rocks and the fractures can be heterogeneous. The program can either accept deterministic fractures obtained from a field survey or generate random fractures by stochastic modeling. It identifies all of the blocks formed by the excavations and the fractures, classifies them, and outputs a result table giving the type, volume, factor of safety, sliding fractures, sliding force, friction force, cohesion force, and so on for each block. It also displays three-dimensional (3D) graphics of the blocks. With GeneralBlock, rock anchors and anchor cables can be designed with the visual assistance of 3D graphics of the blocks and the excavation. The anchors, cables and blocks are shown within the same 3D graphics window, making their spatial relationship very clear.

Lu Xia, Qingchun Yu, Youhua Chen, Maohua Li, Guofu Xue, Deji Chen
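
A minimal sketch of the stability quantity the result table reports, a factor of safety for the simplest case of a block sliding on a single fracture plane; the values and the single-plane simplification are illustrative only:

```python
import math

def factor_of_safety(weight_kN, dip_deg, cohesion_kPa, area_m2, friction_deg):
    """FS = resisting forces (cohesion + friction) / driving force."""
    driving = weight_kN * math.sin(math.radians(dip_deg))    # along the plane
    normal = weight_kN * math.cos(math.radians(dip_deg))     # onto the plane
    resisting = (cohesion_kPa * area_m2
                 + normal * math.tan(math.radians(friction_deg)))
    return resisting / driving

print(factor_of_safety(weight_kN=500, dip_deg=35, cohesion_kPa=20,
                       area_m2=4, friction_deg=30))
```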

On the Volume of Geo-referenced Tweets and Their Relationship to Events Relevant for Migration Tracking

Migration is a major challenge for the European Union, making early preparedness imperative for target states and their stakeholders, such as border police forces. This preparedness is necessary for multiple reasons, including the provision of adequate search and rescue measures. To support it, early indicators are needed for detecting developing migratory push factors related to imminent migration flows. To address this need, we investigated the daily number of geo-referenced Tweets in three regions of Ukraine and the whole of Japan from August 2014 until October 2014, using the data handling tool Ubicity. Additionally, we identified days on which relevant natural, civil or political events took place, in order to detect possible event-triggered changes in the daily number of Tweets. In all the examined Ukrainian regions, a considerable increase in daily Tweets was observed on the day of the parliamentary election. Furthermore, we identified a significant decrease in daily Tweets for Crimea over the whole examined period, which could be related to the political changes that took place there. The natural disasters identified in Japan show no clear relationship with changes in the use of Twitter. The results are a good basis for using communication patterns as a future key indicator in migration analysis.

Georg Neubauer, Hermann Huber, Armin Vogl, Bettina Jager, Alexander Preinerstorfer, Stefan Schirnhofer, Gerald Schimak, Denis Havlik
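
A minimal sketch of the analysis pattern: aggregating geo-referenced tweet counts per region and day, then flagging days that deviate strongly from the regional mean as candidate event-triggered changes. The paper handles the data with Ubicity; pandas and invented counts stand in here:

```python
import pandas as pd

tweets = pd.DataFrame({
    "region": ["Kiev", "Kiev", "Kiev", "Crimea", "Crimea", "Crimea"],
    "day": pd.to_datetime(["2014-10-25", "2014-10-26", "2014-10-27"] * 2),
    "count": [900, 2400, 950, 120, 110, 40],
})

daily = tweets.set_index(["region", "day"])["count"]
# Standardise each region's daily counts against its own mean and spread.
z = daily.groupby("region").transform(lambda s: (s - s.mean()) / s.std())
print(z[z.abs() > 1.0])   # days with unusually high or low tweet volume
```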

Benchmarking Systems and Methods for Environmental Performance Models

Many business activities and procedures influence the environment, and this environmental impact has to be assessed. We consider companies in which environmental performance is measured through an environmental management system; the benchmark methodology has been developed precisely for such companies. It can be used in an initial assessment as a screening method across sectors of different economic activities. The paper surveys the methodology, which is designed for the environmental performance assessment of companies in the food-processing sector, and introduces the architecture of the Web-based environmental benchmarking and reporting system that has also been developed.

Zuzana Chvátalová, Jiří Hřebíček, Oldřich Trenz

High Performance Computing and Big Data

Scalability of Global 0.25° Ocean Simulations Using MOM

We investigate the scalability of global 0.25° resolution ocean-sea ice simulations using the Modular Ocean Model (MOM). We focus on two major platforms hosted at the National Computational Infrastructure (NCI) National Facility: an x86-based PRIMERGY cluster (Raijin) with InfiniBand interconnects, and a SPARC-based FX10 system using the Tofu interconnect. We show that such models produce efficient, scalable results on both platforms up to 960 CPUs. Speeds are notably faster on Raijin when either hyperthreading or fewer cores per node are used. We also show that the ocean submodel scales up to 1920 CPUs with negligible loss of efficiency, but the sea ice and coupler components quickly become inefficient and represent substantial bottlenecks for future scalability. Our results show that both platforms offer sufficient performance for future scientific research, and highlight the challenges for future scalability and optimization.

Marshall Ward, Yuanyuan Zhang
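
Scalability statements like these rest on the standard speedup and parallel-efficiency metrics, sketched below with invented timings rather than the paper's measurements:

```python
def efficiency(base_cores, base_time, cores, time):
    """Parallel efficiency: achieved speedup over ideal speedup."""
    speedup = base_time / time
    return speedup / (cores / base_cores)

# Hypothetical wall-clock times (seconds per simulated period).
runs = [(240, 1000.0), (480, 520.0), (960, 290.0), (1920, 200.0)]
for cores, t in runs:
    print(cores, round(efficiency(240, 1000.0, cores, t), 2))
```

An efficiency drifting well below 1.0 at high core counts is the signature of the coupler and sea-ice bottlenecks the paper describes.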

A Performance Assessment of the Unified Model

The Unified Model (UM) is a model produced by the UK MetOffice for Numerical Weather Prediction (NWP) and climate simulation. It is used extensively by university, government and other research organizations on the large supercomputer hosted at the National Computational Infrastructure (NCI). A 3-year collaboration between NCI, the Australian Bureau of Meteorology and Fujitsu is underway to address performance and scalability issues in the UM on NCI’s supercomputer, Raijin.

IO performance is the dominant factor in the UM’s overall performance. The IO server approach it employs is sophisticated and requires proper calibration to achieve acceptable performance. Global synchronization and file lock contention are problems that can be remedied with simple MPI global collective calls. Complementary IO strategies, such as MPI-IO and direct IO, are being investigated for implementation.

The OpenMP implementation employed in the UM is investigated and found to have inefficiencies that are detrimental to the load balance of the model. Only loop-wise parallelism is employed; given the inherently imbalanced nature of the model, a task-wise approach could yield improved threading efficiency.

Dale Roberts, Mark Cheeseman
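
A minimal sketch of the remedy described for lock contention: ranks agree on write positions via MPI collectives (a prefix-sum scan and an allreduce, here with mpi4py) instead of contending for a file lock. The variable names are illustrative, not UM code:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
local_bytes = (comm.Get_rank() + 1) * 1024       # bytes this rank will write
offset = comm.scan(local_bytes) - local_bytes    # exclusive prefix sum
total = comm.allreduce(local_bytes)              # one collective, no locks
print(f"rank {comm.Get_rank()} writes {local_bytes} bytes "
      f"at offset {offset} of {total}")
```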

The Czech e-Infrastructure and the European Grid Infrastructure Perspective

National e-Infrastructures play an increasingly important role in supporting complex computational and data requirements from all scientific disciplines, environmental informatics being no exception. Since 1996, such an e-Infrastructure has been developed and operated in the Czech Republic, with its emphasis shifting from a shared uniform distributed infrastructure to a more user-tailored environment. Its development parallels (and in some cases precedes) the evolution of the European Grid Infrastructure (EGI), with its current vision of an Open Science Commons. While the current e-Infrastructure concept and its implementation are of a very generic nature, they can be tailored to cover the specific needs of environmental applications. This paper gives an overview of the Czech national e-Infrastructure, its connection to EGI, and a number of applications from environmental science domains.

Ludek Matyska

The NCI High Performance Computing and High Performance Data Platform to Support the Analysis of Petascale Environmental Data Collections

The National Computational Infrastructure (NCI) at the Australian National University (ANU) has co-located a priority set of over 10 PetaBytes (PBytes) of national data collections within an HPC research facility. The facility provides an integrated high-performance computational and storage platform, or High Performance Data (HPD) platform, to serve and analyse massive amounts of data across the spectrum of environmental collections, in particular from the climate, environmental and geoscientific domains. The data are managed in concert with government agencies, major academic research communities and collaborating overseas organisations. By co-locating the vast data collections with high performance computing environments and harmonising these large, valuable data assets, new opportunities have arisen for data-intensive interdisciplinary science at scales and resolutions not hitherto possible.

Ben Evans, Lesley Wyborn, Tim Pugh, Chris Allen, Joseph Antony, Kashif Gohar, David Porter, Jon Smillie, Claire Trenham, Jingbo Wang, Alex Ip, Gavin Bell

Big Data Architecture for Environmental Analytics

This paper develops a big-data-based knowledge recommendation framework architecture for a sustainable precision agriculture decision support system, using computational intelligence (machine learning analytics) and Semantic Web technology (ontological knowledge representation). Capturing domain knowledge about agricultural processes, understanding soil and climatic conditions for harvesting optimization, and recording farmers' valuable but undocumented experience are essential requirements for such a system. An architecture is proposed that integrates data and knowledge from heterogeneous data sources with domain knowledge captured from the agricultural industry. Its suitability for heterogeneous big data integration is examined in several environmental analytics decision support case studies.

Ritaban Dutta, Cecil Li, Daniel Smith, Aruneema Das, Jagannath Aryal

A Performance Study of Applications in the Australian Community Climate and Earth System Simulator

A 3-year investigation is underway into the performance of applications used in the Australian Community Climate and Earth System Simulator on the petascale supercomputer Raijin, hosted at the National Computational Infrastructure. Several applications have been identified as candidates for this investigation, including the UK MetOffice’s Unified Model (UM) atmospheric model and Princeton University’s Modular Ocean Model (MOM). In this paper we present initial results on the performance and scalability of UM and MOM on Raijin. We also present initial results of a performance study of the data assimilation package (VAR) developed by the UK MetOffice and used by the Australian Bureau of Meteorology in its operational weather forecasting suite. Further investigation and optimization are envisioned for each application and will be discussed.

Mark Cheeseman, Ben Evans, Dale Roberts, Marshall Ward

A New Approach for Coupled Regional Climate Modeling Using More than 10,000 Cores

This paper describes an alternative method for coupling atmosphere-ocean regional climate models that communicates momentum, radiation, heat and moisture fluxes between the atmosphere and ocean every time-step, while scaling to more than 10,000 cores. The approach is based on the reversibly staggered grid, which possesses excellent dispersive properties for modeling the geophysical fluid dynamics of both the atmosphere and the ocean. Since a common reversibly staggered grid can be used for both atmosphere and ocean models, we can eliminate the coupling overhead associated with message passing and improve simulation timings. We have constructed a prototype of a reversibly staggered, atmosphere-ocean coupled regional climate model based on the Conformal Cubic Atmospheric Model (CCAM), which employs a global variable resolution cube-based grid to model the regional climate without lateral boundary conditions. With some optimization, the single precision, semi-implicit, semi-Lagrangian prototype model achieved 5 simulation years per day at a global 13 km resolution using 13,824 cores. This result is competitive with state-of-the-art Global Climate Models that can use more than 100,000 cores for comparable timings, making CCAM well suited for regional modeling.

Marcus Thatcher, John McGregor, Martin Dix, Jack Katzfey
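
A toy sketch of per-time-step flux exchange on a shared grid: when atmosphere and ocean fields live on the same grid, coupling reduces to plain array operations each step, with no coupler messages. The physics and coefficients below are placeholders, not CCAM's:

```python
import numpy as np

n = 64
sst = np.full((n, n), 290.0)    # ocean surface temperature (K)
air = np.full((n, n), 285.0)    # lowest-level air temperature (K)
c_heat = 1e-3                   # bulk exchange coefficient (illustrative)

for step in range(100):
    heat_flux = c_heat * (sst - air)   # ocean-to-atmosphere flux each step
    air += heat_flux                   # atmosphere update (toy physics)
    sst -= 0.1 * heat_flux             # ocean update (toy physics)
```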

Backmatter
