Open Access 2021 | Original Paper | Book Chapter

21. Finding What You Need: A Guide to Citizen Science Guidelines

Authors: Francisco Sanz García, Maite Pelacho, Tim Woods, Dilek Fraisl, Linda See, Mordechai (Muki) Haklay, Rosa Arias

Published in: The Science of Citizen Science

Publisher: Springer International Publishing


Abstract

In line with the growth in citizen science projects and participants, there are an increasing number of guidelines on different aspects of citizen science (e.g. specific concepts and methodologies; data management; and project implementation) pitched at different levels of experience and expertise. However, it is not always easy for practitioners to know which is the most suitable guideline for their needs. This chapter presents a general classification of guidelines, illustrating and analysing examples of each type. Drawing on the EU-Citizen.Science project, we outline criteria for categorising guidelines to enable users to find the right one and to ensure that guidelines reach their intended audience. We discuss challenges and weaknesses around the use and creation of guidelines and, as a practical conclusion, provide a set of recommendations to consider when creating guidelines.

Introduction

This chapter is not new guidance about how to ‘do’ citizen science; rather, it is a guide to the guidelines, explaining why and how guidelines should be classified. We mainly focus on citizen science guidelines provided online and in English, while noting that there are others (e.g. print only, in other languages). Furthermore, we consider guidelines that refer not only to scientific methodologies but also to the implementation and maintenance of citizen science projects.
The chapter is aimed at anyone involved in citizen science projects, as well as those planning to produce their own guidelines. We start with a review of existing guidelines, which draws on our experience of participating in and/or managing citizen science projects. As well as elaborating on what makes guidelines useful, we also aim to facilitate greater access to them. We go on to outline how to ensure guidelines can be made more findable through the use of metadata. To conclude, we consider some of the challenges of using and creating guidelines and weaknesses in the existing selection, which point the way to improvements and subjects that should be covered in future iterations.

The Value and Diversity of Guidelines in Citizen Science

Citizen science is a rapidly growing field, with numerous projects around the world (EC 2019). These are having widespread impacts in diverse areas, such as social media (Bautista-Puig et al. 2019), environmental policy (Turbé et al. 2019), and many scientific fields (Follett and Strezov 2015; Kullenberg and Kasperowski 2016; Bautista-Puig et al. 2019). This diversity is also evident in terms of the methodologies used, project duration, the number and type of participants, the range of activities undertaken, the levels of documentation, and the degrees of achievement.
However, with such diversity, knowledge can be forgotten, overlooked, or lost in the vastness of the Internet, because projects lack a powerful, flexible tool to index them correctly and make them visible. One way to counter this, and to capture the knowledge and lessons emanating from citizen science projects, is to document this knowledge and these lessons in guidelines (see Box 21.1).
Box 21.1: What Is a Guideline?
Guideline: Information intended to advise people on how something should be done or what something should be. Cambridge Dictionary (n.d.)
Guidelines are one way in which we capture and share information, and they have been written for numerous subjects. Generally, they are less discursive documents and more directive: What to do (and why); how (best) to do it; and, in many cases, step-by-step instructions that can be followed or adapted.
Several guidelines exist on different aspects of citizen science. These cover, among other subjects, how to set up a project; how to attract, motivate, and retain participants; how to store and manage data; and how to influence policy. They take many forms, such as training manuals, case studies, best practice examples, and how-to guides. And they can be presented in different media: books and printed materials, online materials, videos, and audio broadcasts.
But what are guidelines for? By referring to a guideline, people involved in citizen science projects – whether professional scientists, project managers, or participants – can learn from the lessons, mistakes, and experiences of others. In this way, they can avoid wasting time and resources by repeating common mistakes; they can decide how best to allocate resources; and they can locate ideas and inspiration for how to improve their activities, even if already successful.
However, it is not always easy for people to know which guideline is most suitable for their needs (Skarlatidou et al. 2019), nor will a guideline solve every problem. For example, knowing how past projects have successfully recruited citizen scientists does not mean this can be replicated in other projects.
Despite this, guidelines – whether conceptual or methodological – are an essential component of successful citizen science (see Box 21.2 for some good examples). It is always important to gain an understanding of the fundamentals of the scientific process, such as defining the problem and the right methodology to address it. Referring to relevant guidelines can support this and should therefore be an initial undertaking for any project – one repeated throughout its duration.
For those whose projects have provided insights that could benefit others, guidelines can be a communication tool to share, among other aspects, the project methodology, its lessons, and its successes.
Box 21.2: Good Examples of Citizen Science Guidelines
One leading example of a citizen science guideline is Choosing and Using Citizen Science: A Guide to When and How to Use Citizen Science to Monitor Biodiversity and the Environment by Pocock et al. (2014). This was prepared for the staff of the Scottish Environmental Protection Agency (SEPA) to understand when and how to use citizen science. The guideline provides an introduction to citizen science in environmental monitoring, along with advantages and disadvantages of using this approach. At its core is a decision framework on where and how to choose a citizen science methodology (that is, which issues are suitable and which are not).
Further best practice examples can be found in the many local, national, and international citizen science platforms. For example, the Spanish platform Ciencia Ciudadana en España hosts more than 20 educational guidelines on citizen science, many of which can be used in both formal and non-formal education.
In the international field, the selection of guidelines compiled by the Doing It Together Science (DITOs) project, currently available on the ECSA website,1 should be highlighted. This selection shows how such resources relate to the responsible research and innovation approach. Among these resources is the advice paper from the League of European Research Universities (LERU), which is addressed to scientists and institutions that want to incorporate or foster citizen science methodologies (Wyler et al. 2016). Other noteworthy examples include the two guidelines edited by the Citizen Science Foundation of Chile,2 which are particularly useful for the Latin American community.

Historical Citizen Science Guidelines

In citizen science, guidelines have likely existed for almost as long as the field itself. Yet there are two main difficulties in locating them. Firstly, the term ‘citizen science’ is not useful for identifying the thousands of activities throughout history that match this concept. Secondly, the term ‘guidelines’ does not sufficiently cover the many tools that fulfil the recording function in these activities. Conducting such an extensive review of historical citizen science is beyond the scope of this chapter, but we offer some representative historical examples.
One significant example comes from the history of meteorology in the USA. The ‘meteorological crusade’ took place between 1834 and 1859 and sought to explain the causes of storms, their phenomenology, and the most appropriate research methodologies. The ensuing scientific conflict led to the development of an observational project by the Smithsonian Institution, together with the American Philosophical Society, the Franklin Institute, the Army Medical Department, and the Navy Department. In 1848, the American System of Voluntary Observers in Meteorology was founded under the direction of Joseph Henry, a renowned physicist and a member of the Smithsonian Institution. The number of project participants reached 600 observers in 1860. As Millikan (1997, p. 15) notes, ‘the volunteers mailed monthly reports that included several observations per day of temperature, barometric pressure, humidity, wind and cloud conditions, and precipitation amounts’. For accurate measurement by the volunteer observers, ‘the Smithsonian Institution provided standardized instruments, uniform procedures, free publications, and a sense of scientific unity that went far beyond the normal scope of local universities and academic societies’ (Fleming 1997, n.p.) – an early example of a citizen science guideline.
Also in the nineteenth century, ocean science was consolidated through diverse collaborations, in this case between the British Admiralty, the scientific community, and the maritime community (Reidy 2008). The history of the ‘great tidal experiment’, conducted under the coordination of William Whewell in 1835, is an emblematic scientific project: it was carried out simultaneously by thousands of people from nine countries and colonies at more than 650 stations on both sides of the Atlantic. Whewell’s instructions for recording measurements were undoubtedly intended for use by all those who, with or without previous training, participated in the great experiment. In the document Memoranda and Directions for Tide Observations (1833), there are instructions for how to measure the tides, either continually in one place or at different observation stations. As such, it can be considered one of the earliest historical examples of citizen science guidelines. The sixth series of Whewell’s philosophical transactions contains methodologies as well as the results of the experiment and, while initially addressed to the professional scientific community, was also consulted by ship captains who, together with members of their crews, continued to report measurements for later experiments (Washington 1842).
Guidelines on methodologies, and the collection and recording of data of diverse types and scopes, have an even longer history. This includes activities originating from an authority (academy, government, etc.) that requested the population to collect data. Clavero and Revilla (2014, p. 15) state that diverse ‘historical data on biodiversity have been widely collected over several hundred years through initiatives that today would be described as citizen science’. They illustrate this statement with an example from 1575, when the Spanish government distributed questionnaires, known as topographical relations, in which the more informed inhabitants of each locality had to provide a compilation of their local knowledge, in particular about its natural heritage. Clavero and Revilla (2014, p. 15) detail that ‘the 637 questionnaires that are preserved include information on some 190 species of wild animals and plants, collected in more than 4,300 individual records’. The questionnaires – reproduced in Campos y Fernández de Sevilla (2003) – also included information on economic, cultural, geographical, and legal issues, among many others. The letters written by King Philip II, which ordered the creation of a catalogue together with the questionnaires, constitute a meticulous guideline for the registration of these data. Similar historical data sets exist in China and in most European countries and their former colonies (Clavero and Revilla 2014).

A Guide to the Guidelines

The Need to Classify Guidelines

Beyond academic research, there are two main reasons for classifying guidelines. Firstly, there are numerous guidelines available, but it is not always easy for citizen science practitioners and project managers to find the right resource when many cover similar themes and practices. Secondly, people looking for a guideline on citizen science often have diverse needs and interests, which can be specific (e.g. for a niche branch of citizen science). A thorough system of classification helps people to find the right guideline for their needs and increases the accessibility and usefulness of guidelines. The next step is to classify citizen science guidelines according to their metadata (the information about the guidelines themselves).

Types of Guidelines

Establishing an exhaustive classification of all types of citizen science guidelines is beyond the scope of this chapter, but it is worth distinguishing between the two main types of guidelines. Firstly, those that refer to general aspects of citizen science; these can be defined as lessons learned for use across a range of citizen science projects. Secondly, there are guidelines focused on specific citizen science projects. Each type of guideline can be subdivided between those that cover all aspects of citizen science and those that focus on certain (one or more, but not all) aspects. Table 21.1 outlines this classification system, along with examples to illustrate it.
Table 21.1
Classification of types of citizen science guidelines

Type: General – covering aspects common to all citizen science projects
  Subtype: All aspects
    Examples: Citizen Science for All. A Guide for Citizen Science Practitioners (GEWISS Consortium 2016); Ciencia ciudadana: Principios, herramientas, proyectos de Medio Ambiente (Acevedo 2018)
  Subtype: Focused on certain aspects
    Ethics: Managing Intellectual Property Rights in Citizen Science. A Guide for Researchers and Citizen Scientists (Scassa and Chung 2015)
    Communication: A Guide to Communications in Citizen Science (Veeckman et al. 2019); Guiding Principles for Public Engagement (UCL n.d.)
    Data: Data Management Principles (UKEOF n.d.); Citizen's Guide to Open Data (Geothink 2016)
    Methodology: 'Statistical Solutions for Error and Bias in Global Citizen Science Datasets' (Bird et al. 2014)

Type: Specific – related to one project
  Subtype: All aspects
    Examples: Artportalen Basic Principles (Artportalen n.d.); Manual de Usuario/a de Biodiversidad Virtual (Biodiversidad Virtual Manual) (Biodiversidad Virtual 2014)
  Subtype: Focused on certain aspects
    Ethics: Debian Code of Conduct (Debian n.d.); Código ético de la fotografía en la naturaleza (Ethical Code) (Biodiversidad Virtual n.d.)
    Methodology: Tutorial on the protocol for capturing an insect properly (Mosquito Alert n.d.); Celebrate Urban Birds (Cornell Lab n.d.); Unidad didáctica OdourCollect (Ibercivis 2019)
    Science: Tutorial on Galaxy Zoo (Galaxy Zoo n.d.)

General Guidelines: All Aspects

Although citizen science is an increasingly diverse field, general guidelines on citizen science continue to be produced. One reason is that the concept of citizen science needs to be clarified, as there is not (yet) a widely agreed definition, and, arguably, such a definition is not necessarily useful (see Haklay et al. 2020, this volume). General guidelines address issues common to almost any project, such as those detailed below.
Although most citizen science resources are written in English, two good examples of general guidelines are in German and Spanish. Citizen Science für alle – Eine Handreichung für Citizen Science-Beteiligte (GEWISS Consortium 2016) was written for a German-speaking audience in Germany, Austria, and Switzerland but has been translated into English and is equally useful in many other geographic locations. It describes how citizen science can be used in different fields, such as education, conservation, and the arts and humanities, and targets a wide range of end users from scientists to society-based groups to NGOs. It also provides a relevant set of further resources and references: many of these are only available in German, but some are also accessible in English.
Similarly, Ciencia ciudadana: Principios, herramientas, proyectos de Medio Ambiente (Acevedo 2018) can be considered a general guideline, even though its title makes explicit reference to the environment. The guideline includes a wide variety of tools and resources applicable in any field. It is also a practical text, with abundant examples and methodological tools based on the Canvas model (which helps users to establish a logical relationship between all the components of a project and the variables that influence its success). At the same time, it embodies the ECSA 10 Principles of Citizen Science (ECSA 2015), adapting them to the Chilean context.

General Guidelines: Focused on Certain Aspects

Many guidelines refer to one or more of the many aspects that are relevant to citizen science projects, including ethics, communication, data collection, data analysis and management, scientific methodology, and funding. The depth of coverage also varies, ranging within the same topic from academic studies to introductory guides. Managing Intellectual Property Rights in Citizen Science: A Guide for Researchers and Citizen Scientists (Scassa and Chung 2015) presents an extensive study of ethical issues in citizen science as a guideline for both researchers and citizens. The authors discuss issues of copyright, patents, trademarks, and trade secret laws, as well as the protection of traditional knowledge, all in the context of citizen science. They also include examples of best practices in the field.
Although not presented as a guideline (but fulfilling the functions of one), the academic paper ‘Statistical Solutions for Error and Bias in Global Citizen Science Datasets’, by Bird et al. (2014), describes different methods for reducing bias and error in citizen science databases. The authors conclude that issues related to error and bias found in citizen science data are similar to those found in other large-scale databases and outline some of the tools available to combat them. For example, they explain some statistical approaches used in ecological contexts that are available in free software packages.

Specific Guidelines: All Aspects

Many projects produce comprehensive documents that gather all the relevant information about their activities, for example, as manuals for users or participants or final project reports. Comprehensive project websites can also fulfil the function of a project guideline.
Biodiversidad Virtual's website is one example. Not only is it an open system for users to search for and record sightings (e.g. of plants, fungi, animals, landscapes); it also provides instructions for how to do so and an ethical code for interacting with the environment, thereby acting as a guideline for users as well as other projects. Furthermore, the website stimulates interest in, and increases public understanding of, conservation measures, leading to greater efficiency in conservation efforts and a self-sustaining community.

Specific Guidelines: Focused on Certain Aspects

There are countless guidelines in this category, which is unsurprising given the enormous number of citizen science projects around the world and the many aspects that need to be addressed in each. For example, guidelines for species recognition are sometimes grouped and/or focused on a defined region (e.g. deciduous trees in Europe, day butterflies in New England), while others are even more specific, such as guidelines and tutorials on how to recognise individuals of a particular genus or species. Their prevalence will only increase as citizen science is applied to a growing number of scientific fields.
The Cornell Lab has a long history of citizen science, and one of its flagship projects, Celebrate Urban Birds, has produced bird identification guidelines that seek to make citizen science more inclusive. They are available online and offline, in English and Spanish, making them accessible to a large number of people and communities across the Americas.

Navigating the Diversity of Citizen Science Guidelines

There is a huge diversity of guidelines available, for example, around a particular subject or methodology, but, to the best of our knowledge, there has not been an exhaustive academic review or evaluation of their production and use. As their number continues to grow, in line with the increasing number and diversity of citizen science projects, research of this nature would improve the quality of future guidelines. It would also help to ensure that they address gaps in the field and meet the needs and demands of those working in the field (see Box 21.3).
Box 21.3: Guidelines for Environmental Digital Projects
Skarlatidou et al. (2019) noted that in the field of environmental digital projects, 'hundreds of citizen science applications exist, [but] there is a lack of detailed analysis of volunteers' needs and requirements, common usability mistakes and the kinds of user experiences that citizen science applications generate' (p. 1). This suggests that there is a need for further citizen science guidelines to be developed. Drawing on a systematic review of articles on user issues in environmental digital citizen science, the authors provide a set of design guidelines, assessing them by means of cooperative evaluation. In particular, they seek to 'assist scientists and practitioners with the design and development of easy to use citizen science applications' (Skarlatidou et al. 2019, p. 1).

Metadata

The problem of how to find material is not exclusive to citizen science. It is intrinsic to the nature of the Internet and will undoubtedly be one of the main challenges in the coming years. In an effort to tackle this, the World Wide Web Consortium (W3C) is promoting the semantic web, an extension of the existing web that aims to be a new standard for data formats and data exchange online. This concept, conceived by Tim Berners-Lee, was adopted by more than 4 million domains in 2013. With the semantic web, it is possible to indicate, in a way that search engines can understand, what a page is about. As an example, citizen science projects can indicate their duration, number of participants, research branch, et cetera, while citizen science guidelines can indicate their target end users, types of licence, et cetera. So, with a simple search, someone can see all the citizen science projects carried out in 2016 in Austria that involved 100–150 participants, or all the guidelines in English about citizen monitoring of freshwater resources.
The task of classifying projects and guidelines is not easy, however. First, it is necessary to define the ontology with which to mark them up. In addition, the web pages of citizen science projects and guidelines need to adopt these semantic web standards by including the corresponding markup in their code. Ideally, metadata should also be defined and created in a consensual way by those who participate in citizen science and those who produce its guidelines. Metadata must also be consistent with Internet standards, such as those provided on Schema.org.
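As an illustration only, the minimal sketch below builds a Schema.org-style JSON-LD description for an invented guideline and wraps it in the script tag a web page could embed; the guideline title, field values, and audience are hypothetical, and only the property names follow the Schema.org vocabulary referred to above.

```python
import json

# Hypothetical metadata for an invented guideline, expressed with
# Schema.org property names (see also Table 21.2 later in the chapter).
guideline_metadata = {
    "@context": "https://schema.org",
    "@type": "DigitalDocument",
    "name": "Monitoring Freshwater Quality: A Volunteer Handbook",  # invented title
    "abstract": "Step-by-step instructions for volunteer water sampling.",
    "inLanguage": "en",
    "keywords": "citizen science, freshwater, monitoring",
    "datePublished": "2020-05-01",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "audience": {"@type": "Audience", "audienceType": "project coordinators"},
}

# A project or guideline web page would embed this block so that search
# engines can read the metadata alongside the human-readable content.
script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(guideline_metadata, indent=2)
    + "\n</script>"
)
print(script_tag)
```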

An Approach to Classifying Guidelines: EU-Citizen.Science

An initiative to classify the existing array of guidelines for citizen science has been taking place in the EU-Citizen.Science project. This project has received funding from the European Union’s Horizon 2020 Framework Programme for Research and Innovation. Its ambition is to build, fill, and promote a sustainable platform and mutual learning space providing different tools, best practice examples, and relevant scientific outcomes that are collected, curated, and made accessible to different stakeholders – ranging from interested citizens to scientific institutions, politicians, and the media – in order to mainstream citizen science in Europe.
Repositories already exist for citizen science resources, including guidelines (see Box 21.2). However, they have some limitations, such as infrequent or no updates, unclear selection criteria, and difficulty in assessing their usefulness for the community. To address these gaps, the EU-Citizen.Science project aims to develop criteria to define and identify (1) citizen science resources and best practices and (2) the relevant criteria used to select them. This section summarises the approach adopted by EU-Citizen.Science to identify high-quality citizen science resources, including guidelines, for the EU-Citizen.Science platform.

Initial Steps

Once the process of classifying citizen science resources began, the EU-Citizen.Science project partners realised that defining citizen science resources and deciding how to categorise them was challenging. This challenge was further complicated by the fact that these resources include, but are not limited to, sensors, software, apps, guidelines, websites, podcasts, videos, figures, diagrams, publications, and reports. To address this challenge, a simple categorisation was chosen for resources, namely, (1) tools, (2) guidelines, (3) training resources, and (4) other materials. As part of this process, guidelines were defined as 'a set of rules and instructions that could be helpful in designing, implementing or evaluating citizen science initiatives or initiatives relevant to citizen science' (Fraisl et al. 2020, p. 22).
Based on the understanding that guidelines are created for use by a diverse community, two classification processes were established: (1) a top-down approach to establish criteria for building a repository of resources – including guidelines – relevant to citizen science and (2) a more democratic, bottom-up approach to allow users to collaborate in the process of guideline creation, selection, and inclusion.

A Top-Down Approach to Defining Criteria for Resources Relevant to Citizen Science

Based on a SWOT (strengths, weaknesses, opportunities, and threats) analysis, the project partners decided to (1) determine overarching criteria applicable to all categories of resource; (2) identify a set of specific and supporting criteria for determining ‘good-quality’ resources; and (3) implement a rating system that helps the community decide which resources are most useful and to provide feedback on them. This approach was agreed to be the best way to address the wide-ranging needs and expectations of the platform’s diverse target audiences. Note that some guidelines may prove useful for one particular case or a target group but not in other contexts. Therefore, this process requires community ownership to facilitate and encourage user input in order to make it sustainable and dynamic.

Overarching Criteria for Citizen Science Resources

The EU-Citizen.Science project partners identified three overarching criteria that are applicable to all categories of resources, including guidelines. These are described below.
Criterion 1 (Required): The Resource Is About or Relevant to Citizen Science
There are guidelines created specifically for citizen science, covering themes such as initiating and maintaining a project (e.g. GEWISS Consortium 2016); designing research (e.g. Mindell et al. 2017); engaging citizens (e.g. Wald et al. 2016); and evaluating the outcome of citizen science initiatives (e.g. Illinois Library 2019). There are also guidelines not created specifically for citizen science but which can still be useful, such as the Guiding Principles for Public Engagement (UCL n.d.).
This criterion requires consensus from the moderators of the EU-Citizen.Science platform and the global citizen science community regarding what constitutes citizen science. The ECSA 10 Principles of Citizen Science (ECSA 2015) have already been well received and adopted. However, they remain generic and can be interpreted in different ways. Therefore, the EU-Citizen.Science consortium is currently identifying a set of characteristics for citizen science through an inclusive approach, including perspectives from the global citizen science community and other fields (Haklay et al. 2020), which will inform the guidelines classification process.
Criterion 2 (Required): The Resource Includes a Standard Set of Metadata
Robust metadata will help to establish a standardised approach to classifying and searching for citizen science resources. The EU-Citizen.Science project partners, following the standards from Schema.org for DigitalDocument,3 identified a set of metadata that can be applied to guidelines and other resources to increase accessibility and usefulness, listed in Table 21.2; a short sketch after the table illustrates how such fields could support search and filtering.
Table 21.2
Proposed metadata for classifying citizen science guidelines and other resources
Basic information (required fields in bold): Description

about: The subject matter of the content
abstract: A short description that summarises the guideline
aggregateRating: The overall rating, based on a collection of reviews or ratings
audience: An intended audience, i.e. a group for whom something was created
author: The author of the content or rating
datePublished: Date of first publication
inLanguage: The language of the guideline
keywords: Keywords or tags used to describe the guideline
license: A license document that applies to the content, typically indicated by URL
publisher: The publisher of the guideline
image: An image of the guideline
name: The name or title of the guideline
url: The URL of the guideline
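To show how fields such as those in Table 21.2 could make guidelines easier to find, here is a minimal filtering sketch over a handful of invented records; the records, field values, and search terms are illustrative only and do not describe the actual EU-Citizen.Science platform.

```python
# Invented metadata records using field names from Table 21.2.
records = [
    {"name": "Monitoring Freshwater Quality: A Volunteer Handbook",
     "inLanguage": "en", "keywords": ["freshwater", "monitoring"]},
    {"name": "Guía de observación de aves urbanas",
     "inLanguage": "es", "keywords": ["birds", "urban"]},
    {"name": "Community Air Quality Sensing Guide",
     "inLanguage": "en", "keywords": ["air quality", "sensors"]},
]

def find_guidelines(records, language, term):
    """Return records in the given language whose keywords mention the search term."""
    term = term.lower()
    return [
        r for r in records
        if r.get("inLanguage") == language
        and any(term in k.lower() for k in r.get("keywords", []))
    ]

# e.g. all guidelines in English about monitoring freshwater resources
for match in find_guidelines(records, language="en", term="freshwater"):
    print(match["name"])
```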
Criterion 3 (Suggested): The Resource Engages with the ECSA 10 Principles of Citizen Science
By using adherence to these principles as an important criterion, any resource – including a guideline – is encouraged to align with the general ideas embodied therein. This criterion is suggested instead of required, because the ten principles may not be applicable to all resources; for example, water quality monitoring equipment might be useful for citizen science but does not necessarily need to engage with the ten principles. The complexity of a resource is also a factor, since a repository on citizen science can include numerous resources, which makes it difficult and time-consuming to check them all.

Specific and Supporting Criteria for Citizen Science Resources

After applying the overarching criteria, each resource then needs to be assessed against relevant specific and supporting criteria. Nine specific criteria, or questions, have been identified and agreed by the project partners.4 The characteristics of each resource will be considered when deciding which of the nine criteria are relevant. These questions include: Is the resource easy to access? Is it clearly structured? Is it written in clear language, considering the intended users? The answers to these questions range from 'strongly agree' to 'strongly disagree' on a five-point scale. If the total rating exceeds a threshold, the resource will be listed on the platform.
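As a rough illustration of how such a rating step might work, the sketch below maps five-point answers to numeric scores and compares the total against a threshold; the criterion wording, the scoring convention, and the threshold value are all assumptions made for illustration, not the project's actual implementation.

```python
# An assumed mapping from the five-point scale to numeric scores.
LIKERT_SCORES = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def passes_threshold(answers, threshold):
    """Sum the scores of the answered criteria and compare the total with a threshold."""
    total = sum(LIKERT_SCORES[answer] for answer in answers.values())
    return total > threshold

# Hypothetical moderator answers to three of the specific criteria.
answers = {
    "easy to access": "agree",
    "clearly structured": "strongly agree",
    "clear language for intended users": "neutral",
}

# The threshold value (10) is purely illustrative, not the platform's actual setting.
print(passes_threshold(answers, threshold=10))
```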
Relevant supporting criteria include two evaluation and two impact-related criteria. The supporting criteria include questions to help moderators decide whether a resource should be listed on the platform, but the answers are not included in the rating system. Instead, the moderator is encouraged to use them to strengthen their argument on whether the resource should be included on the platform.

A Bottom-Up Approach to Classifying Guidelines

While the project partners agreed on the need to implement a framework that defines quality, they also agreed on the importance of avoiding overly strict criteria. The purpose of this exercise is to apply certain standards without being exclusionary. Different resources can be helpful in specific contexts, and thus the quality of a resource can be context dependent and subjective.
In light of this, there is also a need for the EU-Citizen.Science platform to allow the citizen science community to define the resources that are useful for them and add these to the platform, as well as to provide feedback on the resources already there. As part of this bottom-up approach to classifying guidelines, participants can therefore upload their own guidelines and other useful resources to the EU-Citizen.Science platform, using the same criteria identified by project partners. They can also rate the existing resources on the platform based on their own experiences and leave detailed comments on the challenges or opportunities of using a particular guideline or any other resource.

Next Steps

The project partners hope that these criteria will be useful for other citizen science platforms in Europe and beyond and can serve as a starting point for future classification processes. However, the EU-Citizen.Science project also recognises the limitations of such an approach. It is not easy to reach a consensus on a process that is inclusive and bottom-up but which also meets the needs and expectations of the platform's different target groups. The project partners are at the stage of implementing this process and agreeing on the curated resources on the platform. It is important to highlight that this process will continue to be improved over the project's lifetime and beyond, given the dynamic nature of citizen science.

Challenges in the Use of Guidelines

Despite their many uses, guidelines for citizen science are not a 'silver bullet' that can fix every problem a project or practitioner may encounter. For example, even if you identify a guideline for your chosen field (e.g. freshwater monitoring), the advice and instructions may not be replicable in, or adaptable to, all contexts: they may be too costly or subject to limitations. Some recommendations in a guideline may not be implementable in certain locations for cultural reasons. Further, while the citizen science community strongly advocates open access online, some guidelines may be inaccessible to some users because of the costs of downloading or buying them. It is beyond the scope of this chapter to discuss all the challenges in using guidelines. Instead, we describe two of the most common challenges, with suggestions for how to mitigate them.

The Need to Revise and Update Guidelines

Guidelines are usually static and all too often created in a top-down manner. This is, to an extent, inherent in their creation and purpose. They should be written from a position of experience and expertise and will often be based on the collated lessons from a number of projects, brought together in one ‘final say’ on a subject or practice – one that should contain a degree of sustainability and longevity.
Yet there are few guidelines that cannot be updated by feedback from new or other experts in the field or improved with feedback from those who have put their recommendations into practice. As the digital age progresses and channels for communication become ever more intertwined with our lives, can we imagine a form of living guideline, one that is constantly being tweaked and revised and adjusted?
On the one hand, such an iterative, cyclical process might lead to citizen science guidelines that are always up to date, representing the current thinking and recommendations on an issue or process. On the other hand, it could lead to a situation in which users are never sure how long term the recommendations in a guideline are, leading to a loss in their credibility and usefulness.
It seems unlikely that guidelines will move to a fully iterative format (i.e. a wiki); there will remain a need for an authoritative statement on many citizen science practices. However, we advocate for all citizen science guidelines to provide an option for users to provide feedback, both on the guideline itself and their experiences in putting its recommendations into practice. Further, in many cases, guidelines should come from, or be co-created with, nonacademic communities (in terms of their content, format, language, etc.), so that their perspectives and expertise are represented in the growth and development of citizen science as a practice. As with citizen science projects, cooperation between different stakeholders is necessary and will help to ensure that guidelines remain true to the core citizen science principles of inclusivity and openness.

Language

Many citizen science guidelines are in English. This represents a barrier to those who are not fluent in English or lack the resources to translate guidelines into their own language(s). Language can also be a limitation when searching online: the best guidelines may not come up in search results if they are not in the same language as the search. Platforms and repositories such as EU-Citizen.Science have an important role to play in minimising this barrier. By curating a collection of carefully selected guidelines, they point users towards a peer-reviewed selection, thus narrowing the scope of their search and suggesting possible resources that can, if required and feasible, be translated.
The style of language used – the complexity, terminology, and tone – is another factor influencing which guideline is most appropriate for a particular individual or project. This can also be an issue for those creating guidelines: by trying to make their texts as accessible as possible, they may lose some of the accuracy, detail, or nuance of the information they are sharing. One way to counter this is to develop the guideline with specific audience(s) in mind and to provide links within it to documentation that explores specific issues more comprehensively (these should be open access, to ensure everyone can read them). The examples presented in this chapter are ones we consider to strike the right balance between complexity and accessibility, although this will of course vary among users and projects.
If current trends continue, citizen science is likely to keep growing, in terms of the number of projects and participants as well as the range of topics it covers. As the field expands, so will the number of guidelines devoted to different aspects of it. These should play a central role in ensuring that citizen science strengthens as it develops, through the sharing of best practices and lessons learned, so that the field in turn becomes increasingly valued as a source of data and expertise.
As the number and scope of citizen science guidelines also grows, it is increasingly important to ensure that they remain easy to use and easy to find. Without a widely used and interoperable system of classification, there is a risk of a crowded, overlapping field, a situation that will leave users confused and unable to find what they need. The criteria proposed in this chapter, building on research by the EU-Citizen.Science project, present one approach to classifying guidelines for citizen science. While acknowledging that this approach is as yet unproven, we urge practitioners in the field to apply it. We also propose the EU-Citizen.Science platform as a place to upload and showcase guidelines – although we note other platforms are also suitable places (and all guidelines can, of course, be linked to from more than one site).
In conclusion, and based on this initial research into citizen science guidelines, we provide a set of practical recommendations for practitioners to consider when creating future guidelines:
1. Review what is out there to avoid recreating existing guidelines.
2. Consider who your guidelines are for, and ensure they are targeted to the needs of those audiences.
3. Ensure that lessons learned and instructions are provided in a format that can, where possible, be adapted to different sectors and contexts.
4. Be honest about what worked well – and what did not work well.
5. Use appropriate metadata to ensure that they can be found by those who will benefit from them.
6. Be open to feedback and suggestions about how to improve or update your guidelines.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
References
Bird, T. J., Bates, A. E., Lefcheck, J. S., Hill, N. A., Thomson, R. J., Edgar, G. J., et al. (2014). Statistical solutions for error and bias in global citizen science datasets. Biological Conservation, 173, 144–154.
Campos y Fernández de Sevilla, F. J. (2003). Las relaciones topográficas de Felipe II: índices, fuentes y bibliografía. Anuario jurídico y económico escurialense, 36, 439–574.
Follett, R., & Strezov, V. (2015). An analysis of citizen science based research: Usage and publication patterns. PLoS One, 10(11), e0143687.
Fraisl, D., Hager, G., & See, L. (2020). Framework report describing criteria and rationale for sharing and selecting state of the art citizen science resources. D3.1 of the EU-Citizen.Science project funded under the European Union's Horizon 2020 research and innovation programme, GA No: 824580. https://zenodo.org/record/3716236#.Xt4JIkUzZPZ. Accessed 8 June 2020.
Kullenberg, C., & Kasperowski, D. (2016). What is citizen science? – A scientometric meta-analysis. PLoS One, 11(1), e0147152.
Reidy, M. S. (2008). Tides of history: Ocean science and her Majesty's navy. Chicago: University of Chicago Press.
Scassa, T., & Chung, H. (2015). Managing intellectual property rights in citizen science: A guide for researchers and citizen scientists. Washington, DC: Woodrow Wilson International Center for Scholars.
Veeckman, C., Talboom, S., Gijsel, L., Devoghel, H., & Duerinckx, A. (2019). Communication in citizen science: A practical guide to communication and engagement in citizen science. Leuven: SCIVIL.
Metadata
Title: Finding What You Need: A Guide to Citizen Science Guidelines
Authors: Francisco Sanz García, Maite Pelacho, Tim Woods, Dilek Fraisl, Linda See, Mordechai (Muki) Haklay, Rosa Arias
Copyright Year: 2021
Publisher: Springer International Publishing
DOI: https://doi.org/10.1007/978-3-030-58278-4_21