The precautionary principle has often been described as an extreme principle that neglects science and stifles innovation. However, such an interpretation has no support in the official definitions of the principle that have been adopted by the European Union and by the signatories of international treaties on environmental protection. In these documents, the precautionary principle is a guideline specifying how to deal with certain types of scientific uncertainty. In this contribution, this approach to the precautionary principle is explicated with the help of concepts from the philosophy of science and comparisons with general notions of practical rationality. Three major problems in its application are discussed, and it is concluded that to serve its purpose, the precautionary principle has to (1) be combined with other decision principles in cases with competing top priorities, (2) be based on the current state of science, which requires procedures for scientific updates, and (3) exclude potential dangers whose plausibility is too low to trigger meaningful precautionary action.
No other safety principle has been so vehemently contested as the precautionary principle. It has repeatedly been accused of being both irrational and unscientific [1, 2], and numerous authors have claimed that it stifles innovation by imposing unreasonable demands on the safety of new technologies [3‐5]. Judging by these accounts, the precautionary principle would seem to be rather extreme. But are these descriptions accurate? In order to answer that question, we need to pay close attention to how the precautionary principle is defined and conceived by those who have the legislative power to apply it. “The Precautionary Principle in Official Documents” delineates how the precautionary principle is defined in official documents. The picture that emerges differs radically from the negative descriptions just referred to. “A Science-based Principle” is devoted to a philosophical explication of the principle, as it is presented in these documents. In “The First Problem: Competing Top Priorities,” “The Second Problem: The Need for Scientific Updates,” and “The Third Problem: Excluding Too Implausible Dangers,” three major problems in the application of the precautionary principle are discussed, and in the final “Conclusion” some conclusions are offered on the effects of applying the principle and on the limitations on its use.
The Precautionary Principle in Official Documents
The precautionary principle is often taken to be a general instruction to be cautious, much like the maxim “better safe than sorry,” with which it has often been equated.1 However, that is not how the precautionary principle is presented in official documents. There, it is described as a principle with a much more limited scope, namely a principle for the evaluation of uncertain or incomplete scientific evidence.2
Precautionary thinking can be traced back many centuries (, p. 26), but the idea of a specific precautionary principle grew out of national and international discussions on environmental policies in the 1980s. A decisive step towards its acceptance was taken when the “precautionary concept found its way into international law and policy as a result of German proposals made to the International North Sea Ministerial Conferences” (, p. 4). The declaration from the Second International Conference on Protection of the North Sea in 1987 was the first major international document in which a “principle” of precaution was promulgated. It was called “the principle of precautionary action,” and meant that “in order to protect the North Sea from possibly damaging effects of the most dangerous substances, a precautionary approach is necessary which may require action to control inputs of such substances even before a causal link has been established by absolutely clear scientific evidence” . Another early statement can be found in the British Government’s White Paper This Common Inheritance from 1990, which promoted “precautionary action to limit the use of potentially dangerous materials or the spread of potentially dangerous pollutants, even where scientific knowledge is not conclusive, if the balance of likely costs and benefits justifies it” (cit. , p. 197).
Perhaps the most influential international proclamation of the principle can be found in the Rio Declaration on Environment and Development, which was adopted at the Rio de Janeiro Earth Summit (Rio Conference) in June 1992:
Principle 15. Precautionary principle
In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.
In the same year, the European Union incorporated the precautionary principle into its legislative framework. This was done in the 1992 Maastricht Amendments to the European Treaty (Treaty of Rome, now known as the Treaty on the Functioning of the European Union) (, p. 206):
Union policy on the environment shall aim at a high level of protection taking into account the diversity of situations in the various regions of the Union. It shall be based on the precautionary principle and on the principles that preventive action should be taken, that environmental damage should as a priority be rectified at source and that the polluter should pay .
A Communication on the Precautionary Principle was published by the European Commission in February 2000. While acknowledging that the treaty only mentions the precautionary principle in connection with environmental protection, the communication asserts that “in practice, its scope is much wider, and specifically where preliminary objective scientific evaluation, indicates that there are reasonable grounds for concern that the potentially dangerous effects on the environment, human, animal or plant health may be inconsistent with the high level of protection chosen for the Community” . A court decision in 2002 confirmed this interpretation and made it clear that the precautionary principle should be considered to be a general principle of European law (, pp. 110–111 and 549–550).
The Communication from 2000 described the principle as an approach to risk management, which “should not be confused with the element of caution that scientists apply in their assessment of scientific data.” The use of scientific information in risk management is strongly emphasized, and the use of the precautionary principle is essentially restricted to decisions under scientific uncertainty. When applying the principle, one should “start with a scientific evaluation, as complete as possible, and where possible, identifying at each stage the degree of scientific uncertainty.” Furthermore, the following six requirements are imposed on applications of the principle:
Where action is deemed necessary, measures based on the precautionary principle should be, inter alia:
- proportional to the chosen level of protection,
- non-discriminatory in their application,
- consistent with similar measures already taken,
- based on an examination of the potential benefits and costs of action or lack of action (including, where appropriate and feasible, an economic cost/benefit analysis),
- subject to review, in the light of new scientific data, and
- capable of assigning responsibility for producing the scientific evidence necessary for a more comprehensive risk assessment. 
The European Commission’s White Paper for a future chemicals policy, published in February 2001, provides additional clarifications on the precautionary principle. The planned legislation was intended to be “in line with the overriding goal of sustainable development and seek to make the chemical industry accept more responsibility by respecting the precautionary principle and safeguarding the Single Market and the competitiveness of European industry.” The precautionary principle demands that “action must be taken even if there is still scientific uncertainty as to the precise nature of the risks” . The legislation referred to here was adopted in 2006 under the name Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH). It requires precautionary measures not only against chemicals that are known to be dangerous but also against substances for which there is insufficient but non-negligible scientific evidence of a danger to human health or the environment .
Three important conclusions can be drawn from these official definitions and explanations of the precautionary principle. First, the principle refers specifically to the evaluation of uncertain evidence for decision-making purposes, which is only one of several types of cautious reasoning that we may apply when making decisions. Thus, it is not a general principle of cautiousness or “better safe than sorry.” Secondly, it is an injunction to take preventive action not only against known dangers but also against potential dangers for which there is only insufficient evidence. Thirdly, the indications of danger that it enjoins us to take seriously are those that are provided by science. Thus, it does not recommend actions based on suppositions or fears that lack support in science.3 The official documents leave no doubt that the precautionary principle is intended to be science-based in this sense.
In public debates, the term “precautionary principle” is often used with a different meaning. In particular, environmental groups often use the term about rules of cautiousness that are more far-reaching than the legally defined precautionary principle, and support more extensive environmental measures. These rules of precaution are stronger than the legally defined principle, in the informal and somewhat vague sense of “stronger” as “demanding or supporting more far-reaching counter-measures.” (In the literature, reference is often made to “strong” and “weak” precautionary principles, but these terms are defined in many different ways.4) Some of these stronger rules of precaution are also interesting objects of scholarly analysis.5 However, the official and legal use of important terms is always of special interest. For instance, the legal notion of theft is subject to focused studies, in which other usages, such as those according to which “property is theft” or “taxation is theft,” are left out of consideration. Mainly for similar reasons, the rest of this article is devoted to an analysis of the precautionary principle as it is defined and expounded in international treaties and in European law. An additional reason for this approach is that, as we will see, a coherent and epistemically interesting conception of precaution can be extracted from these documents.6
The focus will be on how the precautionary principle, as defined and conceived in official documents, can be explicated and clarified. No attempt will be made at a systematic exploration of how it is applied in practical decision-making.7
A Science-based Principle
To clarify the meaning of such science-based precaution, we can use a simple model from the philosophy of science, which shows how scientific data is (ideally) processed and used, both for the purpose of scientific judgments and for that of practical decisions.8
Science has a long tradition of giving much priority to the avoidance of error. A new hypothesis or idea is only accepted if it is supported by convincing evidence. Consequently, the burden of producing such evidence has to be carried by those who put forward the new proposal. This means, for instance, that those who claim to have identified a new hazard or risk have to provide sufficient scientific evidence to convince their colleagues that they are right. For the workings of science, this is an appropriate assignment of the burden of proof. When a new claim is accepted as a scientific fact, future investigations will be based on it. If we accept false statements as scientific facts, then they can hamper scientific progress and lead research into a cul-de-sac. To avoid this, strict criteria of proof have to be applied to new scientific claims.
In science, nothing is accepted once and for all. There are of course scientific standpoints that we currently have no reason whatsoever to doubt, but this does not mean that it is impossible for such reasons to emerge in the future. Therefore, scientific statements should not be treated as definitely and irreversibly accepted. Instead, they should be seen as accepted provisionally, i.e., until reasons to doubt them become known. This provisionality combines with the continuous search for new information to make science self-correcting. This is a crucial mechanism for scientific progress.
The statements that are taken to be scientific facts — provisionally, until we have reasons to doubt and perhaps revise or reject them — form the scientific corpus, i.e., the body of all scientific claims that the scientific community currently holds to be true. See Fig. 1 (, pp. 15–17). Scientific knowledge derives from data that we obtain in experiments and other observations. Based on these data, we construct and critically assess more general scientific statements, including theories, which — if deemed tenable enough — are included in the scientific corpus. The corpus can be defined as the collection of all scientific standpoints that we presently have no reason to doubt.
Since the entry requirements for the corpus are rather strict, the information that it contains is usually reliable enough for the purpose of practical decisions. However, in some practical decisions, we have strong reasons to apply standards of proof that differ from those of science. In particular, there are cases when plausible suppositions that do not satisfy the criteria for inclusion in the corpus are nevertheless relevant for a practical decision.9 The following hypothetical example illustrates the typical structure of such cases:
The Baby Cream Example
An experiment has just been reported in which a product containing nanoparticles was mixed into fodder for pigs in order to increase their uptake of certain nutrients. It turned out that some of the pigs contracted liver cancer, and therefore this feed additive was not introduced for general use. The same type of nanoparticles is also a component of some moisturizing baby creams. A group of toxicologists was tasked with determining if the use of these nanoparticles in skin products has any negative health effects. In their report, they concluded that it was not known whether this use of the nanoparticles posed any danger. In the pigs that ingested them, the particles were absorbed into the bloodstream in the small intestine, and then transported in the blood to the liver and other organs. There were no indications that the particles could be absorbed through the skin. However, data had only been obtained for intact skin, and no information was available for skin affected by infections or other diseases. Upon receiving this report, the agency responsible for the safety of hygienic products had to make a decision based on uncertain information about a possible danger that might not exist.
In this example, a claim that the nanoparticles cause cancer in humans cannot be treated as a (provisional) scientific fact. Such a claim would be an uncertain supposition that is not part of the scientific corpus. But nevertheless, a reasonable case can be made that these nanoparticles should be removed from skin products, at least until more information about their properties has been obtained. In this and many other situations, we tend to act as if a danger exists, even though the scientific information does not amount to full scientific evidence that there is such a danger. Such measures are very much in line with the Rio declaration’s statement that “lack of full scientific certainty” should not be used as a reason not to act against “threats of serious or irreversible damage.” They also conform with the proclamation in the European Commission’s White Paper that “action must be taken even if there is still scientific uncertainty as to the precise nature of the risks.”
Figure 2 illustrates how scientific information can be used for policy-making purposes. Most commonly, information from the corpus is used (arrow 2). However, in order to avoid plausible but uncertain dangers, this may not be enough. In such cases, we need a direct path, a bypass route, to take us from data to policy (arrow 3). This is the crux of the precautionary principle; it allows us to use the bypass route in order to avoid possible dangers.
It is important to note that this bypass route has its starting-point in scientific data that give rise to a suspicion of danger. It cannot be accessed from an entry-point consisting of scientifically unsubstantiated fears or the whims of uninformed opinion. The following three principles have been proposed for science-based decisions employing the bypass route:
The same type of evidence should be taken into account in the policy process as in the formation of the scientific corpus. Policy decisions are not well served by the use of irrelevant data or the exclusion of relevant data.
The assessment of how strong the evidence is should be the same in the two processes.
The two processes may differ in the required level of evidence. It is a policy issue how much evidence is needed for various practical decisions. 
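The three principles above can be summarized in a toy sketch. Everything in it is an invented illustration, not part of any official formulation of the principle: the numeric thresholds, the names, and the idea of reducing evidential strength to a single number are all simplifying assumptions. The point it illustrates is only structural: corpus inclusion and precautionary policy share one and the same appraisal of the evidence, but apply different required levels to it.

```python
# Hypothetical sketch of the "bypass route": one shared appraisal of
# evidence (principles 1 and 2), two different required levels (principle 3).
# Thresholds and the 0-1 strength scale are invented for illustration only.

from dataclasses import dataclass

CORPUS_THRESHOLD = 0.95   # strict standard for accepting a scientific fact
POLICY_THRESHOLD = 0.20   # lower, policy-chosen level for precautionary action


@dataclass
class Hypothesis:
    claim: str
    evidence_strength: float  # one shared assessment, used by both processes


def accepted_in_corpus(h: Hypothesis) -> bool:
    """Entry into the scientific corpus requires near-conclusive evidence."""
    return h.evidence_strength >= CORPUS_THRESHOLD


def triggers_precaution(h: Hypothesis) -> bool:
    """Policy action may be triggered at a lower required level of evidence,
    but the appraisal itself (h.evidence_strength) is the same in both cases."""
    return h.evidence_strength >= POLICY_THRESHOLD


# A plausible but unproven danger, in the spirit of the baby cream example:
nano = Hypothesis("nanoparticles can be absorbed through diseased skin", 0.35)
print(accepted_in_corpus(nano))   # False: not a (provisional) scientific fact
print(triggers_precaution(nano))  # True: plausible enough to act on
```

Note that a scientifically unsubstantiated fear would enter this sketch with an evidence strength near zero, falling below even the policy threshold; this corresponds to the requirement, discussed below, that the bypass route can only be accessed from scientific data.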
Practical decision-making that complies with these principles can be said to follow the tenets of science-based precaution. It can be seen from the official documents referred to above that this model corresponds closely to what is officially meant by the precautionary principle. In particular, public authorities endorsing the precautionary principle emphasize that it can only be triggered by scientifically valid indications of danger. For instance, the Swedish Chemicals Legislation from 1985 required a “reasonable scientific foundation” in order to trigger precautionary measures (, p. 23). According to the Communication from the European Commission in 2000, the principle is intended for cases with “reasonable grounds for concern.” The same document emphasizes that risk assessments have to be based on scientific information. We can conclude from this that the “reasonable grounds” referred to will have to be scientifically supportable .
Nanotechnology provides us with excellent examples both of potential dangers that have sufficient support to trigger the precautionary principle and of potential dangers that lack such support and therefore do not trigger it10:
The hypothesis that nanoparticles cause harm to humans is reasonable given what is known about asbestos and deserves further testing (it must be noted of course that we regularly breathe in nanoparticles without any apparent harm). It is plausible to believe that they might be harmful even though there is not enough evidence to even say that this is probable. It is less clear that gray goo presents a credible threat. For reasons mentioned earlier, there are serious doubts about whether self-replication of the type required is possible. If this is so, then an hypothesis such as ‘the development of self-replicating robots will lead to the gray goo problem,’ while perhaps true, is practically pointless given that the development of these robots is such a remote possibility. (, pp. 143–144)
The requirement that precaution should be science-based must be clearly distinguished from the preposterous but in some circles still popular idea that no action should be taken against a suspected danger unless there is full scientific evidence of its existence. The latter standpoint, which would exclude the by-pass route described above, was introduced in 1993 by the tobacco company Philip Morris. They initiated and funded an ostensibly independent organization called The Advancement of Sound Science Coalition (TASSC), whose major purpose was to sow doubts on the scientifically well-documented negative health effects of passive smoking. One of their means for doing so was to argue that no action should be taken against a potential danger until full scientific proof of its existence has been obtained . Needless to say, this is an irresponsible approach that is sure to invite disaster.
The precautionary principle has usually been put forward as a principle for decision-making in environmental and health-related issues. However, it is an expression of a much more general pattern of thought, namely that protective measures against a potential danger can be justified even if it is not known for sure that the danger exists. This way of thinking can be found in all areas of decision-making, also concerning other types of risk than those that affect human health and the environment. Military commanders do not passively wait for full evidence of a suspected enemy attack before taking counter-measures. Governments and central banks are expected to act against a potential financial crisis without knowing for sure that it will occur.11 A safety engineer will close an elevator for maintenance based on rather weak indications that its cables have been damaged, rather than wait for incontrovertible evidence that this is the case.
Why is the term “precautionary principle” only seldom used in these other areas where essentially the same pattern of thought is applied as in the protection of human health and the environment? The difference seems to be that no special principle is needed in most other areas, since the rationality of taking action in the absence of full evidence is seldom if ever contested. In contrast, preventive measures against potential harms to human health and the environment are often put to question on both ideological and interest-based grounds [46‐48]. This has created a need for a principled defense of such measures. In this perspective, the precautionary principle can be seen as the application of a common pattern of rational reasoning in certain areas where it is often contested.
Since the precautionary principle does not seem to be needed in other areas, one may well ask whether it can have any impact in the areas where it is mainly used, namely public health and environmental policy. As we have now seen, the “official” version of the principle does not prescribe what actions should be taken against the plausible but unproven dangers that it refers to. Instead, it just allows and approves counter-measures against such potential dangers. It has sometimes been maintained that since this (“weak”) version of the precautionary principle does not require any particular action against the potential dangers it refers to, it is forceless and only provides “somewhat innocuous or feeble additions to the regulatory landscape” (, p. 315). This argument, however, views the principle in isolation, not as part of a legal system that also contains other rules and regulations. As was noted in an authoritative commentary on the European legislation, the precautionary principle has to be interpreted in combination with provisions in that legislation stipulating that “the Community institutions are responsible, in all their spheres of activity, for the protection of public health, safety and the environment” (, p. 550). Against this background, it should be no surprise that the General Court of the European Union wrote in 2018:
The precautionary principle is a general principle of EU law requiring the authorities in question, in the particular context of the exercise of the powers conferred on them by the relevant rules, to take appropriate measures to prevent specific potential risks to public health, safety and the environment, by giving precedence to the requirements related to the protection of those interests over economic interests. (emphasis added) (, §109)
The quotation is taken from a decision in which the court upheld the Commission’s restrictions on several potentially harmful pesticides. Parts of these restrictions would have lacked legal support without the precautionary principle. A large number of historical examples have been documented in which scientifically plausible indications of danger have been excluded from consideration, often with disastrous results for human health and the environment [51, 52]. In many of these cases, timely application of the legal principles expounded in the 2018 decision by the General Court could have made a large difference.
In the next three sections, we are going to consider some of the potential problems that tend to accompany the application of the precautionary principle, in the version that can be found in international treaties and European legislation.
The First Problem: Competing Top Priorities
One of the most common criticisms of the precautionary principle is that it assigns an inflexible top priority to the protection of health and the environment. As we have seen, the principle as such does not have such implications. However, if combined with strict legislation requiring protection of human health and the environment, the effect may be a next-to-absolute requirement to take action against possible dangers. But this is not unique. Strong priorities to take actions against possible dangers are also set in other areas. For instance, safety engineering gives the highest priority to the avoidance of accidents, and military thinking to the avoidance of a successful enemy attack. The situation becomes much more complex when there is competition for the top priority. There are two major types of such priority conflicts, both of which can be accentuated by applications of the precautionary principle.
First, conflicts can arise if equally severe effects on health or the environment can plausibly be expected both if we take a particular course of action and if we refrain from taking it. This is often exemplified by reductions of pesticide use in the Global South. Pesticides, in particular insecticides, give rise to considerable environmental damage and to severe cases of occupational disease . However, a decision not to use these substances can lead to crop failure or to increased prevalence of diseases such as malaria. Several discussants have maintained that if the precautionary principle is applied to the risk of a famine or a malaria epidemic, then it will support a revocation of some pesticide bans [54‐56].
Another example of the same type of conflict is the weighing of the risks (side effects) of a medical treatment against the risks of refraining from the treatment. For instance, it is not uncommon for treatments of cancer to incur a risk of treatment-induced (“secondary”) cancer . Obviously, a treatment decision will have to be based on some sort of weighing of the risks of side effects against those of refraining from treatment. However, if there are risks of death on both sides of the balance, then the precautionary principle is not of much help. This is a major reason why the precautionary principle does not have a prominent role in clinical medicine. (It can be more useful in other medical contexts, for instance in decisions on preventive measures and on enhancement.)
The second type of problem arises when different types of effects compete for (or all have) the top priority. For instance, suppose that in a severe pandemic, a government assigns equally high priority to minimizing the fatalities due to the disease as to avoiding the negative economic effects of measures such as travel bans and social distancing that can reduce the death toll. When these two goals run into conflict, the precautionary principle will not be of much help in deciding what to do (unless it is combined with a decision to give one of these two objectives higher priority than the other).
Generally speaking, the precautionary principle loses its bite when risks of the same high priority are at conflict. Other decision-aiding principles may have to be applied in order to achieve adjudication or balance in such cases .
The Second Problem: The Need for Scientific Updates
As we saw in “The Precautionary Principle in Official Documents,” one of the six requirements on applications of the precautionary principle that were proclaimed in the European Commission’s Communication from 2000 is that such measures should be “subject to review, in the light of new scientific data.” Precautionary measures have to be “periodically reviewed in the light of scientific progress, and amended as necessary.” The precautionary principle should only be applied “so long as scientific information is incomplete or inconclusive” . This approach would seem to be rather uncontroversial, given that scientific knowledge is always subject to corrections, additions, and improvements. In practice, however, institutional inertia often makes it difficult to revoke or change a decision. This applies to decisions based on the precautionary principle as well as to other types of decisions.
The regulation of genetically modified organisms provides a clear illustration of these difficulties.12 In July 1974, when this technology was at its very beginnings, 11 of the researchers working with it published a letter in Science, proposing that scientists should “voluntarily defer” two types of experiments with biologically active recombinant DNA molecules. The reason was “serious concern that some of these artificial recombinant DNA molecules could prove biologically hazardous” . A moratorium was in fact put into effect, and it was used by the researchers to perform a careful evaluation of the potential dangers of the new technology. At the Asilomar Conference on Recombinant DNA in February 1975, they concluded that these dangers were manageable. The moratorium was lifted, and experiments were resumed, applying safeguards that had been agreed upon.
Twenty years later, Paul Berg, one of the initiators of the moratorium, co-authored a retrospective paper on genetic modification. He and his co-author Maxine Singer observed that in the preceding two decades, the new technology had revolutionized biological science, and that it had done so without giving rise to any of the harmful effects that the pioneers had feared 20 years earlier:
Literally millions of experiments, many even inconceivable in 1975, have been carried out in the last 20 years without incident. No documented hazard to public health has been attributable to the applications of recombinant DNA technology. Moreover, the concern of some that moving DNA among species would breach customary breeding barriers and have profound effects on natural evolutionary processes has substantially disappeared as the science revealed that such exchanges occur in nature. 
In the quarter century that has passed since this article was published, our knowledge in genetics, plant biology, and ecology has increased dramatically. The uncertainties that justified the 1974 moratorium have been replaced by in-depth understanding of the technology, its mechanisms, and consequences . But nevertheless, the European Union and many other jurisdictions still have legislation on biotechnology whose fundamental principles are based on the level of scientific knowledge in the 1970s, and in particular on the uncertainties that then prevailed. This state of affairs has often been described as a consequence of the precautionary principle. However, as we have just seen, the European Commission’s own position paper makes it clear that decisions based on the precautionary principle should be “periodically reviewed in the light of scientific progress, and amended as necessary.” Therefore, the discrepancy between this legislation and the current status of scientific knowledge should not be seen as a consequence of the European Union applying its precautionary principle, but rather as a consequence of its failure to apply the principle in the way that is prescribed in its own official documents.13
A more general lesson can be learned from this. We need to take precautionary measures in cases of scientific uncertainty, but we also have to adjust these measures when uncertainty gives way to new scientific knowledge. Such adjustments can of course go in either direction: They can lead to more or to less stringent protective measures, depending on the nature of the new information. But experience shows that it is often difficult to keep laws up to date with scientific and technological developments. A common solution is to provide laws with built-in mechanisms for adjustment to new knowledge, without the need for new decisions by the legislative body (, pp. 601–603). Obviously, this need for adaptability concerns not only the precautionary principle but also other legal rules whose application has to be sensitive to future developments in science and technology.
The Third Problem: Excluding Too Implausible Dangers
Some authors have claimed that the precautionary principle requires that even highly implausible suspicions of danger should lead to costly precautionary measures. For instance, Whelan claimed that the principle requires that “we act on all the remote possibilities in identifying causes of human disease,” which lets “the distraction of purely hypothetical threats cause us to lose sight of the known or highly probable ones.” (For similar views, see: [65‐67].)14 This might seem to be a plausible interpretation. If we want to be on the safe side, what reason could there be not to consider all possible dangers?
In fact, there is such a reason, and indeed quite a compelling one. The reason is that such arguments can be constructed both for and against almost anything. For instance, think of some foodstuff that you eat. It is possible that it has some serious long-term health effect that scientists have not yet discovered. But the same applies to everything else that you eat. Therefore, the mere possibility that your favorite food can have negative health effects is not reason enough to refrain from consuming it. To be worth considering, an argument for doing so will have to show that there is a higher degree of plausibility than mere possibility.15
As this example shows, decisions based on the precautionary principle have to be triggered by considerations that have a higher degree of scientific credibility than mere possibilities. Consequently, such decisions cannot be triggered by any contention that someone chooses to make without supporting scientific evidence. Such contentions are a real problem, since they are a common modus operandi of science denialists and other pseudoscientists. One typical example is the completely unfounded claim that a common childhood vaccine, the MMR vaccine (against measles, mumps, and rubella), gives rise to autism. The source of this claim is a retracted paper, which has been shown to be fraudulent [70‐72]. Competently performed epidemiological studies have shown no connection whatsoever between vaccination and autism [73‐75]. Nevertheless, anti-vaccination activists have persisted in claiming that there is a causal connection between vaccination and autism, or at least scientific uncertainty in the matter [76, 77]. According to their argumentation, since scientists cannot prove the absence of a connection with absolute certainty, it must be considered to be a real risk. On the face of it, this might look like a reasonable application of the precautionary principle.
Obviously, science cannot prove with absolute certainty that this vaccine will never, in any person, causally contribute to autism. However, the claims of the anti-vaccinationists can nevertheless be efficiently refuted. The crux of the matter is that in the same sense that the vaccine might contribute to autism, so might anything else that happens in a young person’s life: riding the merry-go-round, playing with a skipping rope, or perhaps eating ice cream and strawberries. From a scientific point of view, these are all at least as strong candidates as MMR vaccination for being causal factors in autism. (Arguably, they are stronger candidates, since there is no negative epidemiological evidence for any of them, as there is for the vaccine.) Furthermore, the alternative supposition that the vaccine protects against autism is no less plausible than the supposition that it gives rise to autism.16 For all these reasons, the claim that the MMR vaccine causes autism lacks the plausibility required for triggering the precautionary principle.
This example shows that in order to serve its purpose, the precautionary principle will have to be applied solely to potential risks supported by specific, scientifically tenable evidence. It cannot be applied to any unsubstantiated possibility of a risk that someone chooses to focus on. To trigger the precautionary principle, a potential risk should, at the very minimum, have a scientific plausibility that is specific to this particular risk, above the plausibility level of “alternative” postulations that would lead us to act differently.
The common claims that the precautionary principle is irrational, that it goes against science, and that it stifles innovation are based on interpretations of the principle that deviate drastically from the official interpretations in international treaties and in legislation and other binding documents adopted by the European Union. In its official versions, the precautionary principle is a guideline for the use of certain types of scientific evidence when making decisions. Importantly, it assumes that policy decisions should be based on science, and it does not leave room for decisions based on suppositions or fears that have no scientific backing. The basic message of the precautionary principle is that preventive measures can be justified by scientific evidence indicating a danger, even if that evidence is not sufficient to prove conclusively that the danger exists. This approach to uncertainty conforms with general principles of practical reasoning, and it can be explicated in detail with the help of a model of the scientific corpus and the science-policy interface. However, like other decision-making principles, the precautionary principle has its limitations. Based on an analysis of some problems arising in its use, we have identified three conditions that should be satisfied for an application of the precautionary principle to serve its purpose:
The precautionary principle cannot adjudicate between competing top priorities. In cases with such a priority structure, it may therefore have to be supplemented with decision principles suitable for weighing different potential outcomes against each other.
All precautionary actions should be based on the current state of science. Therefore, procedures for the scientific update of background information must be in place.
Potential dangers whose plausibility does not rise sufficiently above the level of “mere possibility” must be excluded from serious consideration.
I would like to thank Marko Ahteensuu, Per Sandin, Steffen Foss Hansen, and the editor-in-chief and referees of NanoEthics for valuable comments on an earlier version of this article.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.