
Open Access 17.04.2020 | Original Research Paper

Noise and Synthetic Biology: How to Deal with Stochasticity?

Author: Miguel Prado Casanova

Published in: NanoEthics | Issue 1/2020


Abstract

This paper explores the functional role of noise in synthetic biology and its relation to the concept of randomness. Ongoing developments in the field of synthetic biology pursue the re-organisation and control of biological components to make functional devices. This paper addresses the distinction between noise and randomness in reference to the functional relationships that each may play in the evolution of living and/or synthetic systems. The differentiation between noise and randomness in its constructive role, that is, between noise as a perturbation in routine behaviours and noise as a source of variability that cells may exploit, indicates the need to clarify and, where necessary, rectify the conflicting uses of the notion of noise in studies of so-called noise biology.
Because they behaved with me at random, I will as well behave with them at random.
—Leviticus 26:40–41

Introduction

Information theory generally understands noise as the opposite of information, whether information is taken as a physical magnitude or as knowledge obtained from data. Nonetheless, noise is also understood as a source of novelty and variation in the biological gene pool. Within synthetic biology research, the term noise therefore often refers to stochastic fluctuations that have a functional status. It is important to understand that the word “stochastic” does not entail that an entire cellular system behaves in an entirely random way; it stands for the impossibility of determining with absolute certainty how the system will evolve from a given initial state. Even if some events are “more probable” than others (depending on the physico-chemical properties of the species involved), the global state of the system will always exhibit a certain degree of unpredictability. The disciplines of information theory, statistical thermodynamics and biochemistry offer sufficient evidence to assert that fluctuations in gene expression are inevitable in biological systems [2]; they are the consequence of the intrinsically stochastic nature of molecular interactions. Thus, it is not surprising that the expression levels of individual proteins are subject to random fluctuations over time.
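To make such fluctuations concrete, the following minimal sketch (not drawn from any of the cited studies; the rates and the deterministic mean of 100 molecules are illustrative assumptions) simulates a single gene's protein copy number as a birth-death process with the Gillespie algorithm: production at a constant rate, degradation in proportion to copy number. Two "cells" started from the same state still end up with different counts, and each trajectory keeps fluctuating around the mean.

    import random

    # Minimal Gillespie simulation of one gene's protein copy number
    # (illustrative, assumed parameters):
    #   production:  0 -> P at rate k_prod
    #   degradation: P -> 0 at rate k_deg * n
    def gillespie_birth_death(k_prod=10.0, k_deg=0.1, n0=0, t_end=200.0, seed=None):
        rng = random.Random(seed)
        t, n = 0.0, n0
        trajectory = [(t, n)]
        while t < t_end:
            a_prod, a_deg = k_prod, k_deg * n
            a_total = a_prod + a_deg
            t += rng.expovariate(a_total)          # waiting time to the next event
            if rng.random() < a_prod / a_total:    # pick the event by its propensity
                n += 1
            else:
                n -= 1
            trajectory.append((t, n))
        return trajectory

    if __name__ == "__main__":
        # Two genetically identical "cells" started from the same initial state:
        for seed in (1, 2):
            final_count = gillespie_birth_death(seed=seed)[-1][1]
            print(f"cell {seed}: protein count at t_end = {final_count} "
                  f"(deterministic mean k_prod/k_deg = 100)")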
Shannon’s traditional characterisation of information as a measure of the diminishment of uncertainty does not suffice for the richness of a “biotic system” ([3], p. 37) that propagates its organisation (or instructional information) by transforming free energy into work. DNA’s information is not fixed, as Shannon’s information is, to the selection of message elements from a set (“selective information” in MacKay’s words [4]), but is context-dependent, like MacKay’s “structural information” [4]. Thus the same genotype can bring about different phenotypes depending on the environment or context. Contemporary “teleosemantic” approaches to genetic information were introduced by Sterelny et al. [5], Maclaurin [6] and Maynard Smith [7]. These start from the idea of genes as “carriers of a message”, a message which conveys a prescriptive or imperative content, as opposed to an indicative or descriptive one. Their “direction of fit” to their goals is such that, if the genes and the phenotype mismatch, what we find is an instance of unfulfilled instructions rather than imprecise descriptions.
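For reference, the "selective" picture being contrasted here can be stated compactly with the standard textbook definitions (not specific to any source cited above): the Shannon entropy of a source, and information as the reduction of that uncertainty obtained by observing another variable,

    H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad I(X;Y) = H(X) - H(X \mid Y).

Nothing in these formulas refers to what the selected message elements mean for the receiving system, which is the gap that the notions of structural and biotic information are meant to address.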
On this notion of information in biology (biotic information), the constraints that make possible the propagation of organisation in a living organism stand for the information content of that organism [3]. Gene expression is a stochastic (or noisy) process, and gene regulation is decisive for adaptation and biological signal processing.
This paper explores the functional role of noise in synthetic biology and its relation to the concept of randomness. Ongoing developments in the field of synthetic biology pursue the re-organisation and control of biological components to make functional devices. This paper addresses the distinction between noise and randomness in reference to the functional relationships that each may play in the evolution of living and/or synthetic systems. The differentiation between noise and randomness in its constructive role, that is, between noise as a perturbation in routine behaviours and noise as a source of variability that cells may exploit, indicates the need to clarify and, where necessary, rectify the conflicting uses of the notion of noise in studies of so-called noise biology (e.g. [8, 9]), developmental noise (e.g. [10–14]) and noise-induced and noise-oriented phenomena (e.g. [15]).
This paper will therefore argue that the investigation of the role of the concept of noise in synthetic biology should contain an account of both the structural resilience of a system to noise and an investigation of the functional integration of randomness. The response to this issue is relevant both to techniques used in synthetic biology and to how the field conceptualises the functional dimensions of the systems that noise is altering or constructing. In the last decade there have been remarkable efforts challenging the problematic misconceptualisation of chance and noise in the form of random fluctuations and perturbations (e.g. [16–18]; or [19]).2 In what follows, we will cover their progress, since we need another account of the phenomenon of noise as it enters a system in many different structural-functional configurations.

What We Are Talking About when We Talk About Noise and (Synthetic) Biology

Whenever matter is rearranged to create a new information structure, the introduction of an element of chance is needed. Without alternative possibilities, no new information is possible. It seems contradictory that noise, in the form of randomness, can be the paradigm source of variability (or new information), aligning its definition with low or negative entropy, the very opposite of our common understanding of noise as positive entropy.3 But systems are never sufficient to cancel the relative universality of what is not a system. Informational constraints and boundary conditions, together with noise, co-determine and co-enable biological systems. Since quantum-level processes bring in noise, stored information may contain errors. When information is recalled, it is again exposed to noise, which may also corrupt the information content. Despite the constant presence of noise inherent to biological systems, these systems have accumulated and increased their consistent4 information content over billions of generations. In Monod’s words, noise is the “progenitor of evolution in the biosphere and accounts for its unrestricted liberty of creation, thanks to the replicative structure of DNA: that registry of chance, that tone-deaf conservatory where the noise is preserved along with the music” ([21], pp. 116–117).
At first, synthetic biologists considered stochastic gene expression (or noise) an important obstacle to overcome. Nowadays, it has arguably become one of the main insights contributed by the discipline, since it reshapes our understanding of why, how and when specific genes are expressed. However, it is not yet clear how cells actually manage to deal with random outcomes in their expression, and how they achieve robustness. To what extent is expression noise tolerable? Is it simply without effect, or can it lead to adverse consequences? If cells can actually use their internal noise to cope with the external noise of an unpredictable environment [26], does this mean that cells have adapted (in the course of evolution) to cope with stochastic fluctuations, or, perhaps much more interestingly, to take advantage of them and to be optimised to function in their presence?

Case Studies on Noise and Synthetic Biology

The landmark characterisation of stochastic gene expression was carried out in the field of synthetic biology. In their experiments, researchers found noisy behaviour in gene expression interfering with the operation of engineered genetic circuits. This is the case for one of the first practical examples of synthetic biology: the Repressilator [27]. The fluctuations found there involve non-linear feedback mechanisms that lead to complex behaviours. The Repressilator is a circular system of three genes, arranged in a feedback loop in which each product sequentially inhibits the expression of the next gene, resulting in oscillatory behaviour. Elowitz and Leibler discovered that the oscillations exhibited marked fluctuations in their period and magnitude and hypothesised that stochastic behaviour in gene expression was responsible for these effects. It is important to add that the stochastic fluctuations found in the Repressilator were in fact unwanted perturbations, muddling the intended deterministic behaviour. What is interesting is how this triggered the enquiry into modifying the design of the Repressilator in order to achieve more robust behaviour. Particularly fascinating for the researchers was the question of whether the stochastic fluctuations they detected could also perform a functional role. In later research within synthetic biology, noise based on stochastic fluctuations gained a functional status [28]. In another experiment (unequivocally oriented towards the control of fluctuations), Becskei and Serrano [29] demonstrated that engineering a circuit with negative feedback could decrease cell-to-cell variability in expression.

The process of pattern formation in living systems is also of central interest to synthetic biologists attempting to develop living tissue in the laboratory. Synthesised tissues could have innumerable potential medical applications, but in order to engineer living tissues, researchers need to understand the genesis of pattern formation in living systems. Recently, Karig et al. [30] engineered bacteria that, when incubated and grown, exhibited stochastic Turing patterns. It is the first in vivo proof of the principle that patterns can be stabilised by noise [31]. Turing patterns can be spots, stripes or spirals that arise naturally in living organisms. In 1952, Turing’s groundbreaking paper “The chemical basis of morphogenesis” provided a theory for the formation of patterns in systems undergoing reaction and diffusion of their ingredients, the so-called reaction–diffusion theory of morphogenesis: stationary chemical patterns can be achieved from a system of two different interacting molecules (called morphogens) if they have specific characteristics [32]. One is an “activator”, which is autocatalytic and so introduces positive feedback. The other is an “inhibitor”, which represses the autocatalysis of the activator and so provides negative feedback. It is essential that they have different rates of diffusion: the inhibitor must be faster. Turing patterns were originally observed in some specific chemical reactions, but such patterns have proven very difficult to verify in biological systems. Goldenfeld explains that the problem with Turing’s mechanism is:
that it hinges on a criterion that isn’t satisfied in many biological systems, namely that the inhibitor must be able to move much more quickly than the activator. For example, if instead of chemicals, we were looking at two creatures in an ecosystem, like wolves and sheep, the wolves would need be able to move around much faster than the sheep to get classic Turing patterns. What this would look like, you would first see the sheep grow in number, feeding the wolves, which would then also grow in number. And the wolves would run around and contain the sheep, so that you would get little localized patches of sheep with the wolves on the outside. That’s essentially the mechanism in animal terms for what Turing discovered. [33]
In their recent research, Karig et al. [30] devised a theory of stochastic Turing patterns, wherein patterns develop from the noise of stochastic gene expression instead of relying on a large difference between inhibitor and activator diffusion rates. The researchers used synthetic biology to engineer bacteria based on Turing’s activation–inhibition idea. They built a fully stochastic model of the process occurring in these synthetic pattern-forming gene circuits and compared its theoretical predictions with what the bioengineers observed in the petri dishes. Resorting again to the analogy of wolves and sheep, Goldenfeld addresses the issue of the speed difference between the activator and the inhibitor by asking:
what happens if there is only a small number of sheep, so that there are large fluctuations in population numbers? Now you get processes where sheep die at random. And we discovered, when you give birth to randomness, that actually drives the formation of stochastic Turing patterns.” […] “The theory of stochastic Turing patterns doesn’t require a great difference in speed between the prey and the predator, the activator and the inhibitor. They can be more or less the same, and you still get a pattern. But it won’t be a regular pattern. It’ll be disordered in some way. [33]
Turing patterns can in fact be achieved even in situations where one would not expect to observe them, but they are disordered patterns: stochastic Turing patterns. Noise causes the formation of transient, stochastic Turing patterns for parameter values at which deterministic patterns do not form. In this case, it is the noise of stochastic gene expression that gives rise to these patterns. These results show that Turing-type pattern-forming mechanisms, if driven by stochasticity, can potentially underlie a broad range of biological patterns. These experiments provide the groundwork for a unified portrait of biological morphogenesis, emerging from the combination of stochastic gene expression and dynamical instabilities.
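To give a sense of the mechanism being described, here is a one-dimensional sketch of a generic activator-inhibitor reaction-diffusion system with an optional additive noise term. It is not the Karig et al. model: the Gierer-Meinhardt-type kinetics, the saturation constant, the diffusion coefficients and the noise amplitude are all illustrative assumptions, and whether the noisy run below the deterministic Turing threshold shows a genuine noise-sustained quasi-pattern rather than mere roughness depends on how close those assumed parameters sit to the threshold. The sketch simply compares the spatial non-uniformity of the activator field in three regimes: fast-diffusing inhibitor without noise, slowly diffusing inhibitor without noise, and slowly diffusing inhibitor with noise.

    import numpy as np

    def simulate(d_ratio=30.0, noise=0.0, n=128, steps=20000, dt=0.01, dx=1.0, seed=0):
        """Explicit Euler integration of a 1-D Gierer-Meinhardt-type activator-inhibitor
        system with periodic boundaries and optional additive noise (all parameters are
        illustrative assumptions, not taken from the cited experiments)."""
        rng = np.random.default_rng(seed)
        d_a, d_h = 1.0, 1.0 * d_ratio            # activator / inhibitor diffusion
        c, kappa = 2.0, 0.01                     # inhibitor turnover, saturation
        a = 1.0 + 0.01 * rng.standard_normal(n)  # activator, near the uniform state
        h = 1.0 + 0.01 * rng.standard_normal(n)  # inhibitor
        for _ in range(steps):
            lap_a = (np.roll(a, 1) - 2 * a + np.roll(a, -1)) / dx ** 2
            lap_h = (np.roll(h, 1) - 2 * h + np.roll(h, -1)) / dx ** 2
            # activator is (saturating) autocatalytic; inhibitor tracks a^2 and represses a
            da = a * a / (h * (1.0 + kappa * a * a)) - a + d_a * lap_a
            dh = c * (a * a - h) + d_h * lap_h
            a = a + dt * da
            h = h + dt * dh
            if noise > 0.0:
                a += noise * np.sqrt(dt) * rng.standard_normal(n)
                h += noise * np.sqrt(dt) * rng.standard_normal(n)
            a = np.clip(a, 1e-6, None)           # keep concentrations positive
            h = np.clip(h, 1e-6, None)
        return a

    if __name__ == "__main__":
        cases = [("fast inhibitor, no noise", 30.0, 0.0),
                 ("slow inhibitor, no noise", 8.0, 0.0),
                 ("slow inhibitor, with noise", 8.0, 0.05)]
        for label, d_ratio, noise in cases:
            a = simulate(d_ratio=d_ratio, noise=noise)
            print(f"{label:28s} spatial std of activator = {a.std():.4f}")

In the stochastic Turing regime reported by Karig et al., the relevant randomness is the intrinsic noise of low-copy-number gene expression rather than an added Gaussian term; the additive term here is only a crude stand-in for it.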
Even though these different research projects confirmed that noise in gene expression is important and could even be controlled, the molecular basis of the perceived variability remained unclear. Elowitz et al. [34] and Ozbudak et al. [35] were the pioneers in exploring the origins of stochastic gene expression.

Distinction Between Intrinsic/Extrinsic Noise and Stochastic Pulsing

Elowitz et al. introduced the concepts of extrinsic and intrinsic noise in gene expression (analysed mathematically by [36]). The overall variability in gene expression within an isogenic population (one characterised by substantially identical genes) is described as biological noise. Gene expression in such populations is not consistent from cell to cell, even in populations with a stable average expression level or steady state. This occurs because of variations in “‘hardware’ units, such as transcriptional–translational machinery and regulatory molecules (resulting in extrinsic noise), as well as the inherent stochasticity attributed to the random nature of single-molecule kinetics (resulting in intrinsic noise)” ([37], p. 384). These are the two principal types of biological noise that have been defined within the realm of systems biology, and both greatly enrich the phenotypic heterogeneity of genetically identical populations.
According to Elowitz et al. [34], extrinsic noise in gene expression is caused by cell-to-cell differences. These differences between cells, whether in local environment or in the concentration or activity of any factor that influences gene expression, will cause extrinsic noise. This includes fluctuations in the amount or activity of molecules such as the regulatory proteins that act on regions of DNA or the enzymes that transcribe them, which in turn produce subsequent fluctuations in the output of the gene. These fluctuations are regarded as sources of extrinsic noise that are global to a single cell but vary from one cell to another. That is to say, extrinsic noise points to the fact that a cell is not an autonomous thing; it is embedded in an organism and sustains links with it through integration and regulation mechanisms operating in various directions.
Intrinsic noise, on the other hand, refers to the stochastic fluctuations within the system being considered. Generally, they are the product of the inherently probabilistic nature of the underlying biochemical reactions. In other words, it is called intrinsic noise because it originates from the very nature of the system’s elements and not from external disturbances. Determined by the structure, reaction rates and species concentrations of the underlying biochemical networks, biological intrinsic noise is directly tied to the expression of a single gene. The reason for this noise is that all transcription and translation events have their origin in stochastic collisions between the components of the transcription and translation machinery of each gene. Thus, the same gene will almost never be expressed at exactly the same time in two different cells.
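Operationally, Elowitz et al. [34] separated the two contributions with a dual-reporter strategy: two identical promoters in the same cell drive two distinguishable reporters, so that variation shared by both reporters is attributed to extrinsic noise and variation that is uncorrelated between them to intrinsic noise. The sketch below implements the commonly used estimators in that spirit; the formulas follow the standard dual-reporter definitions (see [34, 36]), while the simulated measurements are invented purely for illustration.

    import numpy as np

    def dual_reporter_decomposition(c, y):
        """Estimate squared intrinsic, extrinsic and total noise from paired
        reporter measurements c and y taken across a population of cells
        (standard dual-reporter estimators, as in [34, 36])."""
        c, y = np.asarray(c, dtype=float), np.asarray(y, dtype=float)
        mc, my = c.mean(), y.mean()
        eta_int2 = np.mean((c - y) ** 2) / (2.0 * mc * my)   # uncorrelated part
        eta_ext2 = (np.mean(c * y) - mc * my) / (mc * my)    # correlated part
        return eta_int2, eta_ext2, eta_int2 + eta_ext2

    if __name__ == "__main__":
        # Invented illustrative data: one shared (extrinsic) factor per cell, plus
        # independent (intrinsic) fluctuations for each reporter.
        rng = np.random.default_rng(0)
        n_cells = 5000
        shared = rng.lognormal(mean=0.0, sigma=0.2, size=n_cells)
        c = 100.0 * shared * rng.lognormal(0.0, 0.1, size=n_cells)
        y = 100.0 * shared * rng.lognormal(0.0, 0.1, size=n_cells)
        eta_int2, eta_ext2, eta_tot2 = dual_reporter_decomposition(c, y)
        print(f"intrinsic^2 = {eta_int2:.3f}, extrinsic^2 = {eta_ext2:.3f}, total^2 = {eta_tot2:.3f}")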
Noise dependence is also a key factor in dynamic cellular reactions to varying environmental conditions. Living organisms respond and react to changes in their environment. They do so by decoding the information contained in these changes; this entails stochastic pulses in the activation and deactivation of regulatory factors within a population. Negative and positive feedback are characteristic kinds of regulation in genetic networks. Stochastic pulsing is the result of the interaction between the positive and negative feedback loops of such systems [38], where the negative feedback loop produces pulses and the positive feedback loop serves to amplify them. Additionally, the fluctuations arising from noise appear to be an intrinsic attribute of gene expression, as can be observed in artificial cells made up of cell membrane-mimetic vesicles. Synthetic biology needs a proper understanding of cellular noise, considering that its goal is to engineer gene circuits with well-defined functional properties. The application of synthetic biology tools to the study of the diverse components of stochasticity, through the analysis, control and exploitation of biological noise (e.g. [26, 37]), yields understanding of the regulatory mechanisms that tune biological noise in natural networks. If noise can be a positive factor, enhancing a system’s robustness, this could support the design of innovative synthetic devices, with potential benefits in multiple fields across biotechnology.

The “Information Metaphor Falsehood” and the Glorification of Noise

Both the field of epistemology [28, 39, 40] and the philosophy of biology [16–18, 41] have recently raised concerns about the dominance of reductionism in biology, and in particular in biological engineering. These disciplines also stress a problem of nomenclature: we see examples of what is called noise that might in fact be randomness playing a positive role for an organism. This would conform to an image of biological phenomena matched up with physical explanations [18], so that the theories of the special sciences are reduced to fundamental physical theories. Epistemic reductionism would assume that even complex systems share the same basic mechanistic processes and could be understood in terms of the behaviour of micro-physical entities.
In a gesture against this epistemic reductionism, Perret and Longo (Ibid, p. 1) state:
[T]he adoption of information in biology is an erroneous transposition from a specific mathematical domain to one where it does not belong. Indeed, the mathematical framework of the information theory is too rigid and discrete to fit with biological phenomena. Therefore, information in biology represents an inappropriate metaphor.
They maintain that the breeding ground for our current Information Age is the theory of the elaboration of information (Turing-Kolmogorov) [42] and the theory of the transmission of information (Shannon-Brillouin) [43, 44], both based on computing discrete values but wrongly assuming the “independence of the encoding from its material embodiment” ([18], p. 3). They state that there is no discrete informational value for any part of a biological system but, on the contrary, only a context-specific meaning. They elaborate a critique of the current trend in genetics, which is characterised by a “central dogma” circumscribed by a “genocentric view of DNA” ([19], p. 43) and which considers the process of gene expression as a unidirectional flow of information. As a result, any other variables are understood and processed as noise, and any unpredictable outcomes are understood and processed as results of noise. Such a use of the concept is, in Longo’s view, an illegitimate and misleading overextension of the term “noise”. They find an example of this in Monod’s well-known statement: “[F]rom a source of noise natural selection alone and unaided could have drawn all the music of the biosphere” ([21], p. 118).
We cannot argue against the evidence that a computer (Turing) or a cable (Shannon) implies material determinations that differ fundamentally from the continuous dynamics that take place in the morphological constitution of a biological organism. Perret and Longo rightly warn that applying the mathematical framework of information theory entails a theoretical account of Laplacian predictability “that opposes determination to noise and that is largely superseded, even in classical physics, by the modern theory of dynamical systems […]. [A]pplying information theory to biology is not free from the attitude that tries to reduce complex biological systems to deterministic systems” ([18], p. 5). However, the allegation of a scientifically erroneous exportation of theories of elaboration and transmission of information to biology (as a gesture of methodological reductionism) fails to acknowledge that it is not applying the metaphor of computation/information to biology that is wrong, but rather that the wrong image of computation/information was applied. This is because what constitutes information is locally determined by the process of which its epistemic metarepresentation forms part. Information can never be meaningfully considered in isolation; it must always be seen in the context of its language-processing system and the work module that this is in turn connected with (and this is the reason why Shannon information is an inadequate measure of biological information). As Wilkins [19] explains, Perret and Longo argue that, in opposition to reductionist and deterministic images of biological processes as composed of generic particles and discrete data points, we should make a case for the specificity of the material arrangement of living systems. Instead of understanding biological systems as “noise-immunised informational processes, the ‘default state’ of the living (on analogy with Galileo’s principle of inertia) should be understood as (random) variability” (Ibid, p. 43), and biological organisation as the sustainment and dissemination of materially specific constraints which render that randomness, so that no element of the system contains a discrete informational value, but rather a context-specific meaning.
According to Longo [16], a system is robust when it resists noise. This is particularly true of living systems, where randomness has a functional role that contributes (in an essential way) to the structural stability of system dynamics. Random mutations and copying errors in genetic replication have also been theorised as noise; however, Longo et al. argue that this kind of variability is so functional in biological evolution that describing it as noise is a spurious scientific characterisation. Longo understands noise as referring, in general, to small (and frequent) fluctuations (2018), which may actually disturb the achieved stability of a biological system. He argues that we should not call these intrinsically random aspects of onto-phylogenesis [16] noise, but rather consider them indispensable components of stable biological complexity. For him, randomness is so intrinsic to the evolutionary stability of those systems that it can be argued there is no noise for such systems. Noise is recognised by Longo as an information-theoretic notion, totally unsuitable for theorising in the realm of biology. Moreover, he maintains that if there is a productive role for noise, we should replace the term “noise” with another concept that would also encapsulate randomness and model deviation as playing a functional role by stimulating variability and diversity. But is this use of the notion of randomness in the organisation of information an instance of the functionality of noise?
In order to proceed with this clarification, I will first develop an ultimately problematic idea initially articulated by René Thom and Robert Chumbley [20]: randomness and noise are relative to the specification of a scale and language for analysis. I argue that Thom’s understanding of noise as subjective or belonging only to the process of conceptualisation functions as a productive argumentative foil through which to rehabilitate noise as a concept applicable to certain cases emerging from (synthetic) biology. Thom’s approach, which stands in explicit contrast to Darwinism, thus remains fruitful if we understand it beyond the scope of his Laplacian Worldview.
Central to Thom’s argument is a critique of Darwin’s notion of “descent with modification” ([45], p. 171) for making “an illegitimate use of chance” (Thom 1994, p. 12). Darwin’s principle is premised on the “extreme sensitivity” ([45], Chapter 5) of biological dynamics to minor changes in external and internal conditions. For Darwin, random variation or noise is at the core of variability and diversity production in evolution, which makes selection and, actually, life possible (and understandable). Thom extended this critique of Darwin to Prigogine, Monod and other “fetishists” of noise. Thom’s counter-argument is that noise is in our process of conceptualisation. Moreover, he explicitly claims that intelligibility cannot include randomness as an intrinsic component of the analysis of a system’s dynamics. He maintains that “the signal-noise distinction is then fundamentally subjective” (1994, p. 20).

Where Is Noise Located?

I saw the earth in the Aleph and in the earth the Aleph once more and the earth in the Aleph… ([46], p. 151)
Thom’s position runs counter to the understanding of randomness and noise I wish to argue for. Contra Thom and Prigogine, Longo contends that randomness is neither in nature (Prigogine) nor exclusively in the theories that we use to talk about nature (Thom). Rather, Longo argues, it is in the interface between our theoretical proposals and reality, which is whatever we may access by measurement and by measurement only [17]. That is, Longo considers that randomness appears as a result of measurement. Measurement is understood by him as the classical and quantum interface between our human computational models and the world, and randomness at measurement is either treated as epistemic (typically, we expect causes for the fluctuations) or as intrinsic to the theory: quantum mechanics contemplates some acausal phenomena found at measurement. Accordingly, Longo (personal communication, 2018) sees randomness located at the interface, where measurement, by the various a priori principles grounding each theory, is either indeterminate or approximate.
Yet if noise and stochastic processes are closely linked and all processes in nature are fundamentally stochastic [47], where would this interface credibly lie? Stochastic processes are frequently neglected in the macroscopic world due to the law of large numbers, which states that a given random operation, provided some initial constraints, will tend to even out towards an average result the higher its number of iterations. While this is understandable for systems at equilibrium, where the relative magnitude of fluctuations for a system with N degrees of freedom scales as 1/√N, the central limit theorem does not always apply (Ibid). Biology deals with living systems that are manifestly non-equilibrium, and even macroscopic systems can exhibit anomalously large fluctuations [48]. But we could argue that randomness is only at the interface when made relative to a particular information-processing model. Such a model, as regards its distribution of subjective and objective constituents, can be considered an epistemic truth because it is relative to measurement and to the capacity of an information-processing agent to predict; but this does not entail a denial of its ontological status: a process/event/object has an objective degree of randomness for any computational system relative to its computational power. There are various measures of randomness/complexity that are objective because they hold for any information-processing system, such as Chaitin-Kolmogorov complexity [49–51]. Kolmogorov complexity theory, or algorithmic information theory, quantifies the minimum amount of information needed to replicate a given signal. It can be understood as the shortest computer program that produces a certain behaviour. An infinite string of 0s is very simple: no matter how long it is, it is just one character repeated. The sequence “8.1446925...” is more complex; it might look random, but we could write a relatively elementary program to reproduce it to any given level of precision. The shortest description length captures the objective degree of randomness of a sequence/object, but that randomness is still relative to the information-processing system and comes into view at the interface between the observer and the sequence/object. We can draw a parallel between this and the problem of the ontological status of information: the world is informational, but you need information processing for information to exist. Is it credible to make any sort of distinction between a theoretical proposition and a reality which is definable only and exclusively in terms of measurability, i.e. a mathematisable reality?
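Kolmogorov complexity itself is uncomputable, but a crude, observer-relative proxy for description length is how far a general-purpose compressor can shrink a sequence; the upper bound it gives depends on the compressor, which illustrates the relativity to an information-processing system discussed above. The sketch below is only an illustration of this idea (it is not a method used by any of the cited authors), comparing a constant string, a periodic string and computer-generated pseudo-random bytes.

    import os
    import zlib

    def compressed_size(data: bytes) -> int:
        """Length of the zlib-compressed data: an upper bound on description length
        relative to this particular compressor (Kolmogorov complexity is uncomputable)."""
        return len(zlib.compress(data, 9))

    if __name__ == "__main__":
        n = 10_000
        samples = {
            "constant (all zeros)": b"0" * n,            # highly regular structure
            "periodic ('01' repeated)": b"01" * (n // 2),
            "pseudo-random bytes": os.urandom(n),         # no structure the compressor can exploit
        }
        for name, data in samples.items():
            print(f"{name:26s} raw = {len(data):6d} bytes, compressed = {compressed_size(data):6d} bytes")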
I argue that even if randomness is located at the interface constituted by the computational bond of the measuring/cognising mind plus body, this should be considered part of the nature wherein measurement cannot but be applied. If this point is accepted, moreover, then what is there for the so-called interface to mediate? Between what, that is, does it intervene, and from where?

Final Remarks on the Definitional Spectrum of Noise

In the section “The ‘Information Metaphor Falsehood’ and the Glorification of Noise”, we acknowledged that there is no discrete informational value for any part of a biological system. For this reason, a taxonomical classification of the different types of noise, or its potential interchangeability with stochasticity, randomness, variation, variability or uncertainty, does not seem advisable.
We should nonetheless make a technical distinction (according to the different fields of research, and where such distinctions are actually useful) between the terms “variation”, “variability”, “randomness” and “uncertainty”.
We can conceive variability and uncertainty as two different classes of variation, each involving different sources and kinds of randomness. Authors such as van Belle [52] understand variability as referring to natural variation, whereas uncertainty refers to the degree of accuracy with which a quantity is measured. According to Bravi and Longo [16], randomness “may be understood as unpredictability with respect to an intended theory” and measurement. For Longo it is a constructive or enabling constraint5; similar to Kauffman et al.’s [3] information, “constraints are information and information is constraints”, but randomness (as well as noise) differs insofar as it presents a limit to predictability. Longo tries to make this limit precise in biology as “a component of production of an unpredictable” and a “constructive production of diversity”. The overall problem of noise should then be reframed into an
… alternative epistemology of living beings which accounts for the structures of determination inherent to biology and for an autonomous definition of randomness sticking to this idea, history and contexts, as well as internal constraints of integration and regulation mechanisms, can be thought to constrain possible evolutionary paths that dynamically arise in the interaction with the environment rather than to determine the outcome (as determinism requires that the same effects derive from the same causes). ([16], p. 9)
According to Longo et al., as soon as we perceive what is commonly understood as noise taking on a constructive role that leads the system towards robustness, it should be moved into the category of functional randomness. The distinction between noise and randomness in its constructive role is thus of paramount importance. In agreement with Wilkins, Longo and his collaborators, I contend that they are not interchangeable. Rather, I argue that randomness is noise when it is interfering with a system; but as soon as it is integrated by the system as a stabilising element, it becomes problematic to use the concept of noise, precisely because it is no longer perturbing the system. Thus, the distinction between the two is made in reference to their functional roles. For Bravi and Longo (Ibid, p. 17), randomness in physics is “non deterministic or deterministic non-predictability within a pre-given phase space”, while in biology randomness is “intrinsic indetermination given also by changing within a pre-given phase spaces (ontogenesis and phylogenesis)”.
We do need an alternative definition of noise as an enabling constraint, at once biophysical and cognitive, for the functioning of complex adaptive systems: a definition that helps us to overcome our own slippery understanding of the “botched dialectic”6 ([19], p. 7) that makes perturbations below the threshold of measurement, and self-organising systems (which create spontaneous “order out of randomness”), intrinsic components of everything we qualify as producing freedom, novelty, creativity and invention, in opposition to reductionist perspectives that align scientific knowledge with the characterisation of systems in terms of mechanistic determinism.

Acknowledgements

The author would like to thank two anonymous reviewers as well as Iain H. Grant, Inigo Wilkins, Giuseppe Longo and Sonia de Jager. All shortcomings, errors, etc. are of course the sole responsibility of the author.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Footnotes
1. Translated by Henri Atlan; see Atlan [1], p. 508.
2. A criticism already outlined by Thom and Chumbley in “Stop Chance! Silence Noise!” [20] against the work of Monod [21], Prigogine and Stengers [22], Atlan [23] and Serres [24].
3. Entropy and noise are equated in Wiener [25]. Wiener established the connection between Shannon's concept of entropy as a measure of uncertainty in communication and Schrödinger's concept of negative entropy as a source of order in living organisms. Shannon's theory looked at the accurate transmission of messages. In this context, the information transmitted is equivalent to the entropy of the information source: the higher the initial uncertainty (entropy), the higher the amount of information obtained in the end. For Shannon, higher entropy indicated more information; for Schrödinger, on the other hand, higher entropy indicated less order. Wiener brought these approaches into line. He redefined the notion of entropy, relating it not to the initial uncertainty (as in Shannon's definition) but to the degree of uncertainty remaining after the message has been received. Higher entropy (noise) now entailed less information. Thus, while Shannon's information is equivalent to regular entropy, Wiener's information is equivalent to negative entropy. Owing to this change of sign, information (order) now opposes entropy both in information theory and in biological organisation. Wiener thus successfully presented a notion of information (negative entropy) as a general measure of certainty, order and organisation in any given “system”, whether living or technical.
4. Please note that I am not talking of physical invariance/stability.
5. Such as the two sides of a coin (constraints), which enable us to describe (and thus to determine) the toss of a coin.
6. See also Châtelet [53], p. 27.
References
1. Atlan H (1995) Comment le dieu biblique peut « aller au hasard » en hébreu mais pas en traduction. Meta: Journal des Traducteurs 40(3):508
2. Lestas I, Vinnicombe G, Paulsson J (2010) Fundamental limits on the suppression of molecular fluctuations. Nature 467:174–178
3. Kauffman S, Logan RK, Este R, Goebel R, Hobill D, Shmulevich I (2008) Propagating organization: an enquiry. Biol Philos 23:27–45
4. MacKay D (1969) Information, mechanism and meaning. MIT Press, Cambridge, MA
5. Sterelny K, Smith KC, Dickison M (1996) The extended replicator. Biol Philos 11:377–403
6. Maclaurin J (1998) Reinventing molecular Weismannism. Biol Philos 13(1):37–59
7. Maynard Smith J (2000) The concept of information in biology. Philos Sci 67(2):177–194
8. Rao C, Wolf D, Arkin A (2002) Control, exploitation and tolerance of intracellular noise. Nature 420(6912):231–237
9. Vilar J, Kueh H, Barkai N, Leibler S (2002) Mechanisms of noise-resistance in genetic oscillators. Proc Natl Acad Sci U S A 99(9):5988–5992
10. Barkai N, Shilo B (2007) Variability and robustness in biomolecular systems. Mol Cell 28(5):755–760
11. Blomberg C (2006) Fluctuations for good and bad: the role of noise in living systems. Phys Life Rev 3:133–161
12. Lewontin R (2000) The triple helix: gene, organism, and environment. Harvard University Press, Cambridge, MA
13. Raser J, O'Shea E (2005) Noise in gene expression: origins, consequences, and control. Science 309(5743):2010–2013
14. Kussell E, Kishony R, Balaban N, Leibler S (2005) Bacterial persistence: a model of survival in changing environments. Genetics 169:1807–1814
15. Meyer H, Roeder A (2014) Stochasticity in plant cellular growth and patterning. Front Plant Sci 5:420
16. Bravi B, Longo G (2015) The unconventionality of nature: biology, from noise to functional randomness. In: Unconventional Computation and Natural Computation (UCNC 2015), Auckland, NZ, 31 Aug–4 Sept 2015. Lecture Notes in Computer Science, vol 9252. Springer, pp 3–34
17. Calude C, Longo G (2016) Classical, quantum and biological randomness as relative unpredictability. Nat Comput 15:263–278
18. Perret N, Longo G (2016) Reductionist perspectives and the notion of information. In: Soto AM, Longo G, Noble D (eds) From the century of the genome to the century of the organism: new theoretical approaches (special issue). Prog Biophys Mol Biol 122(1):11–15
19. Wilkins I (2020) Irreversible noise. Manuscript submitted for publication. Urbanomic, Falmouth
20. Thom R, Chumbley R (1983) Stop chance! Silence noise! SubStance 12(3):11–21
21. Monod J (1971) Chance and necessity: an essay on the natural philosophy of modern biology. Vintage Books, New York
22. Prigogine I, Stengers I (1984) Order out of chaos: man's new dialogue with nature. Bantam Books, New York
23. Atlan H (1972) L'organisation biologique et la théorie de l'information. Hermann, Paris
24. Serres M (1980) Le parasite. Grasset, Paris
25. Wiener N (1948) Cybernetics: or control and communication in the animal and the machine. MIT Press, Cambridge, MA
26. Eldar A, Elowitz M (2010) Functional roles for noise in genetic circuits. Nature 467:167–173
27. Elowitz MB, Leibler S (2000) A synthetic oscillatory network of transcriptional regulators. Nature 403(6767):335–338
28. Knuuttila T, Loettgers A (2011) Synthetic modeling and the functional role of noise. Paper presented at Epistemology of Modeling and Simulation: Building Research Bridges between the Philosophical and Modeling Communities, University Club, University of Pittsburgh, 1–3 April 2011
29. Becskei A, Serrano L (2000) Engineering stability in gene networks by autoregulation. Nature 405:590–593
30. Karig D, Martini K, Lu T, DeLateur N, Goldenfeld N, Weiss R (2018) Stochastic Turing patterns in a synthetic bacterial population. Proc Natl Acad Sci 115(26):6572–6577
32. Turing A (1952) The chemical basis of morphogenesis. Philos Trans R Soc Lond B Biol Sci 237(641):37–72
34. Elowitz MB, Levine A, Siggia E, Swain P (2002) Stochastic gene expression in a single cell. Science 297:1183–1186
35. Ozbudak E, Thattai M, Kurtser I, Grossman A, van Oudenaarden A (2002) Regulation of noise in the expression of a single gene. Nat Genet 31:69–73
36. Swain P, Elowitz M, Siggia E (2002) Intrinsic and extrinsic contributions to stochasticity in gene expression. Proc Natl Acad Sci U S A 99:12795–12800
37. Ciechonska M, Grob A, Isalan M (2016) From noise to synthetic nucleoli: can synthetic biology achieve new insights? Integr Biol 8(4):383–393
39. Knuuttila T, Loettgers A (2014) Varieties of noise: analogical reasoning in synthetic biology. Stud Hist Phil Sci 48:76–88
40. Loettgers A (2009) Synthetic biology and the emergence of a dual meaning of noise. Biol Theory 4(4):340–356
41. Longo G (2018) How future depends on past and rare events in systems of life. Found Sci 23(3):443–474
42. Turing A (1936) On computable numbers, with an application to the Entscheidungsproblem. Proc Lond Math Soc 42:230–265
43. Brillouin L (1962) Science and information theory. Academic Press, New York
44. Shannon C (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423
45. Darwin C (1859) On the origin of species by means of natural selection, or the preservation of favoured races in the struggle for life. John Murray, London
46. Borges JL (1949) El Aleph. In: El Aleph. Losada, Buenos Aires (English translation by A. Kerrigan, “The Aleph”, in A Personal Anthology. Grove, New York, 1967)
47. Tsimring L (2014) Noise in biology. Rep Prog Phys 77(2):026601
48. Keizer J (1987) Statistical thermodynamics of nonequilibrium processes. Springer, Berlin
49. Chaitin G (1966) On the length of programs for computing finite binary sequences. J ACM 13(4):547–569
50. Kolmogorov A (1965) Three approaches to the quantitative definition of information. Probl Inf Transm 1(1):1–7
51. Kolmogorov A (1983) Combinatorial foundations of information theory and the calculus of probabilities. Russ Math Surv 38(4):27–36
52. van Belle G (2008) Statistical rules of thumb. Wiley, Hoboken, NJ
53. Châtelet G (2014) To live and think like pigs. Urbanomic & Sequence Press, Falmouth
Metadata
Title: Noise and Synthetic Biology: How to Deal with Stochasticity?
Author: Miguel Prado Casanova
Publication date: 17.04.2020
Publisher: Springer Netherlands
Published in: NanoEthics | Issue 1/2020
Print ISSN: 1871-4757
Electronic ISSN: 1871-4765
DOI: https://doi.org/10.1007/s11569-020-00366-4
