The recent discussion on scientific representation has focused on models and their relationship to the real world. It has been assumed that models give us knowledge because they represent their supposed real target systems. This, however, is where agreement among philosophers of science has tended to end, as they have presented widely different views on how representation should be understood. I will argue that the traditional representational approach is too limiting as regards the epistemic value of modelling, given its focus on the relationship between a single model and its supposed target system, and its neglect of the actual representational means with which scientists construct models. I therefore suggest an alternative account of models as epistemic tools. This amounts to regarding them as concrete artefacts that are built by specific representational means and constrained by their design in such a way that they facilitate the study of certain scientific questions, and learning from them by means of construction and manipulation.
Representation has been one of the main themes in the recent discussion of models. Several authors have argued for a pragmatic approach to representation that takes users and their interpretations into account. It appears to me, however, that this emphasis on representation places excessive limitations on our view of models and their epistemic value. Models should rather be thought of as epistemic artifacts through which we gain knowledge in diverse ways. Approaching models this way stresses their materiality and media-specificity. Focusing on models as multi-functional artifacts loosens them from any pre-established and fixed representational relationships and leads me to argue for a two-fold approach to representation.
Is there something specific about modelling that distinguishes it from many other theoretical endeavours? We consider Michael Weisberg's thesis that modelling is a form of indirect representation through a close examination of the historical roots of the Lotka–Volterra model. While Weisberg discusses only Volterra's work, we also study Lotka's very different design of the Lotka–Volterra model. We will argue that while there are elements of indirect representation in both Volterra's and Lotka's modelling approaches, they are largely due to two other features of contemporary model construction processes that Weisberg does not explicitly consider: the methods-drivenness and outcome-orientedness of modelling. Contents: 1 Introduction; 2 Modelling as Indirect Representation; 3 The Design of the Lotka–Volterra Model by Volterra (3.1 Volterra's method of hypothesis; 3.2 The construction of the Lotka–Volterra model by Volterra); 4 The Design of the Lotka–Volterra Model by Lotka (4.1 Physical biology according to Lotka; 4.2 Lotka's systems approach and the Lotka–Volterra model); 5 Philosophical Discussion: Strategies and Tools of Modelling (5.1 Volterra's path from the method of isolation to the method of hypothesis; 5.2 The template-based approach of Lotka; 5.3 Modelling: methods-driven and outcome-oriented); 6 Conclusion.
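For orientation, the model at issue couples a prey population x and a predator population y. In its now-standard textbook form (conventional modern notation, not the notation of Volterra's or Lotka's original papers), it reads:

```latex
\begin{aligned}
\frac{dx}{dt} &= \alpha x - \beta x y, \\
\frac{dy}{dt} &= \delta x y - \gamma y,
\end{aligned}
```

where $\alpha$ is the prey growth rate, $\gamma$ the predator death rate, and $\beta$, $\delta$ the interaction coefficients.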
Deidealization as a topic in its own right has attracted remarkably little philosophical interest despite the extensive literature on idealization. One reason for this is the often implicit assumption that idealization and deidealization are, potentially at least, reversible processes. We question this assumption by analyzing the challenges of deidealization within a menu of four broad categories: deidealizing as recomposing, deidealizing as reformulating, deidealizing as concretizing, and deidealizing as situating. On closer inspection, models turn out to be much more inflexible than the reversal thesis would have us believe, and deidealization emerges as a creative part of modeling.
This paper presents an artifactual approach to models that also addresses their fictional features. It first discusses the imaginary accounts of models and fiction that set model descriptions apart from imagined objects, concentrating on the latter (Frigg in Synthese 172:251–268, 2010; Frigg and Nguyen in The Monist 99:225–242, 2016; Godfrey-Smith in Biol Philos 21:725–740, 2006; Philos Stud 143:101–116, 2009). While the imaginary approaches accommodate surrogative reasoning as an important characteristic of scientific modeling, they simultaneously raise difficult questions concerning how the imagined entities are related to actual representational tools, coordinated among different scientists, and connected with real-world phenomena. The artifactual account focuses, in contrast, on the culturally established external representational tools that enable, embody, and extend scientific imagination and reasoning. While there are commonalities between models and fictions, it is argued that the focus should be on the fictional uses of models rather than on considering models as fictions.
The epistemic value of models has traditionally been approached from a representational perspective. This paper argues that the artifactual approach evades the problem of accounting for representation and better accommodates the modal dimension of modeling. From an artifactual perspective, models are viewed as erotetic vehicles constrained by their construction and the available representational tools. The modal dimension of modeling is approached through two case studies. The first portrays mathematical modeling in economics, while the other discusses the modeling practice of synthetic biology, which exploits and combines models in various modes and media. Neither model is intended to represent any actual target system. Rather, both are constructed to study possible mechanisms through the building of a model system with built-in dependencies.
This paper examines two recent approaches to the nature and functioning of economic models: models as isolating representations and models as credible constructions. The isolationist view conceives of economic models as surrogate systems that isolate some of the causal mechanisms or tendencies of their respective target systems, while the constructionist approach treats them rather like pure constructions or fictional entities that nevertheless license different kinds of inferences. I will argue that whereas the isolationist view is still tied to the representationalist understanding of models that takes the model-target dyad as the basic unit of analysis, the constructionist perspective can better accommodate the way we actually acquire knowledge through them. Using the example of Tobin's ultra-Keynesian model I will show how many of the epistemic characteristics of modelling tend to go unrecognised if too much focus is placed on the model-target dyad.
One striking feature of contemporary modelling practice is its interdisciplinary nature. The same equation forms, and the same mathematical and computational methods, are used across different disciplines as well as within a single discipline. Are there, then, differences between intra- and interdisciplinary transfer, and can a comparison between the two provide more insight into the challenges of interdisciplinary theoretical work? We study the development and various uses of the Ising model within physics, contrasting them with its applications to socio-economic systems. While renormalization group methods justify the transfer of the Ising model within physics, by ascribing the systems modeled to the same universality class, its application to socio-economic phenomena has no such theoretical grounding. As a result, the insights gained by modelling socio-economic phenomena with the Ising model may remain limited.
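For reference, the model whose travels the abstract traces can be stated in one line. For spins $s_i \in \{-1,+1\}$ on a lattice, with coupling $J$ and external field $h$ (standard textbook notation, not tied to any particular application discussed above), the energy of a configuration is:

```latex
H(s) = -J \sum_{\langle i,j \rangle} s_i s_j - h \sum_i s_i
```

where $\langle i,j \rangle$ ranges over neighboring pairs. Interdisciplinary transfer reinterprets the spins (magnetic moments, traders' positions, voters' opinions) while keeping this mathematical form fixed.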
Our concern is to explain how and why models give us useful knowledge. We argue that if we are to understand how models function in actual scientific practice, the representational approach to models proves either misleading or too minimal. We propose turning from the representational approach to the artefactual one, which also implies a new unit of analysis: the activity of modelling. Modelling, we suggest, can be approached as a specific practice in which concrete artefacts, i.e., models, are constructed with the help of specific representational means and used in various ways, for example for the purposes of scientific reasoning, theory construction, and the design of experiments and other artefacts. Furthermore, in this activity of modelling, model construction is intertwined with the construction of new phenomena, theoretical principles, and new scientific concepts. We illustrate these claims by studying Sadi Carnot's construction of the ideal heat engine.
The picture of synthetic biology as a kind of engineering science has largely created the public understanding of this novel field, covering both its promises and risks. In this paper, we will argue that the actual situation is more nuanced and complex. Synthetic biology is a highly interdisciplinary field of research located at the interface of physics, chemistry, biology, and computational science. All of these fields provide concepts, metaphors, mathematical tools, and models, which synthetic biologists typically utilize by drawing analogies between the different fields of inquiry. We study analogical reasoning in synthetic biology through the emergence of the functional meaning of noise, which marks an important shift in how engineering concepts are employed in this field. The notion of noise also serves to highlight the differences between the two branches of synthetic biology: the basic-science-oriented branch and the engineering-oriented branch, which differ from each other in the way they draw analogies to various other fields of study. Moreover, we show that fixing the mapping between a source domain and a target domain seems not to be the goal of analogical reasoning in actual scientific practice.
Synthetic biology has a strong modal dimension that is part and parcel of its engineering agenda. In turning hypothetical biological designs into actual synthetic constructs, synthetic biologists reach towards potential biology instead of concentrating on naturally evolved organisms. We analyze synthetic biology's goal of making biology easier to engineer through the combinatorial theory of possibility, which reduces possibility to combinations of individuals and their attributes in the actual world. While the last decades of synthetic biology explorations have shown biology to be much more difficult to engineer than originally conceived, synthetic biology has not given up its combinatorial approach.
The recent discussion of fictional models has focused on imagination, implicitly considering fictions as something nonconcrete. We present two cases from synthetic biology that can be viewed as concrete fictions. Both minimal cells and alternative genetic systems are modal in nature: they, as well as their abstract cousins, can be used to study unactualized possibilia. We approach these synthetic constructs through Vaihinger's notion of a semi-fiction and Goodman's notion of semifactuality. Our study highlights the relative existence of such concrete fictions. Before their realization, neither minimal cells nor alternative genetic systems were well-defined objects, and the subsequent experimental work has given more content to these originally schematic imaginings. But it is as yet unclear whether individual members of these heterogeneous groups of somewhat functional synthetic constructs will eventually turn out to be fully realizable, remain only partially realizable, or prove outright impossible.
One striking feature of contemporary modeling practice is its interdisciplinarity: the same function forms and equations, and the same mathematical and computational methods, are being transferred across disciplinary boundaries. Within philosophy of science, this interdisciplinary dimension of modeling has been addressed by both analogy-based and template-based approaches, which have proceeded separately from each other. We argue that a full-blown account of model transfer needs both perspectives. We examine analogical reasoning and template application through a detailed case study on the transfer of the Ising model from physics into neuroscience. Our account combines the analogy-based and template-based approaches through the notion of a model template, which highlights the conceptual side of model transfer.
Synthetic biology is often understood in terms of the pursuit of well-characterized biological parts to create synthetic wholes. Accordingly, it has typically been conceived of as an engineering-dominated and application-oriented field. We argue that the relationship of synthetic biology to engineering is far more nuanced than that and involves a sophisticated epistemic dimension, as shown by the recent practice of synthetic modeling. Synthetic models are engineered genetic networks that are implanted in a natural cell environment. Their construction is typically combined with experiments on model organisms as well as with mathematical modeling and simulation. What is especially interesting about this combinational modeling practice is that, apart from yielding greater integration between these different epistemic activities, it has also led to the questioning of some central assumptions and notions on which synthetic biology is based. As a result, synthetic biology is in the process of becoming more "biology inspired."
Recently, Bechtel and Abrahamsen have argued that mathematical models study the dynamics of mechanisms by recomposing the components and their operations into an appropriately organized system. We will study this claim through the practice of combinational modeling in circadian clock research. In combinational modeling, experiments on model organisms and mathematical/computational models are combined with a new type of model—a synthetic model. We argue that the strategy of recomposition is more complicated than what Bechtel and Abrahamsen indicate. Moreover, synthetic modeling as a kind of material recomposition strategy also points beyond the mechanistic paradigm.
The attempt to define life has gained new momentum in the wake of novel fields such as synthetic biology, astrobiology, and artificial life. In a series of articles, Cleland, Chyba, and Machery claim that definitions of life seek to provide necessary and sufficient conditions for applying the concept of life, something that such definitions cannot, and should not, do. We argue that this criticism is largely unwarranted. Cleland, Chyba, and Machery approach definitions of life as classifying devices, thereby neglecting their other epistemic roles. We identify within the discussions of the nature and origin of life three other types of definitions: theoretical, transdisciplinary, and diagnostic definitions. The primary aim of these definitions is not to distinguish life from nonlife, although they can also be used for classificatory purposes. We focus on the definitions of life within the budding field of astrobiology, paying particular attention to transdisciplinary definitions, and to diagnostic definitions in the search for biosignatures from other planets.
Natalia Carrillo and Tarja Knuuttila claim that there are two traditions of thinking about idealization offering almost opposite views on its functioning and epistemic status. While one tradition views idealizations as epistemic deficiencies, the other highlights the epistemic benefits of idealization. Both, however, treat idealizations as deliberate misrepresentations. The authors then argue for an artifactual account of idealization, comparing it to the traditional accounts and exemplifying it through the Hodgkin and Huxley model of the nerve impulse. From the artifactual perspective, the epistemic benefits and deficiencies introduced by idealization frequently come in a package due to the way idealization draws together different resources in model construction. Accordingly, idealization tends to be holistic in that it is often not easily attributable to specific parts of the model. They conclude that the artifactual approach offers a unifying view of idealization in that it is able to recover several basic philosophical insights motivating both the deficiency and the epistemic-benefit accounts, while remaining detached from the idea of distortion by misrepresentation.
How do philosophers of science make use of historical case studies? Are their accounts of historical cases purpose-built for putting forth and discussing philosophical positions, and lacking in evidential strength as a result? We study these questions through an examination of three different philosophical case studies. All of them focus on modeling and on Vito Volterra, contrasting his work with that of other theoreticians. We argue that the worries concerning the evidential role of historical case studies in philosophy are partially unfounded, and that the evidential and hermeneutical roles of case studies need not be played off against each other. In philosophy of science, case studies are often tied to conceptual and theoretical analysis and development, rendering their evidential and theoretical/hermeneutic roles intertwined. Moreover, the problems of resituating or generalizing local knowledge are not specific to philosophy of science but commonplace in many scientific practices, which show similarities to the actual use of historical case studies by philosophers of science.
Idealization is commonly understood as distortion: representing things differently than how they actually are. In this paper, we outline an alternative artifactual approach that does not make misrepresentation central to the analysis of idealization. We examine the contrast between the Hodgkin-Huxley (1952a, b, c) and the Heimburg-Jackson (2005, 2006) models of the nerve impulse from the artifactual perspective, and argue that, since the two models draw upon different epistemic resources and research programs, it is often difficult to tell which features of a system their central assumptions are supposed to distort. Many idealizations are holistic in nature. They cannot be locally undone without dismantling the model, as they occupy a central position in the entire research program. Nor is their holistic character mainly related to the use of mathematical and statistical modeling techniques, as portrayed by Rice (2018, 2019). We suggest that holistic idealizations are implicit theoretical and representational assumptions that can only be understood in relation to the conceptual and representational tools exploited in modeling and experimental practices. Such holistic idealizations play a pivotal role not just in individual models, but also in defining research programs.
This paper distinguishes between causal isolation robustness analysis and independent determination robustness analysis, and suggests that the triangulation of the results of different epistemic means or activities serves different functions in each. Circadian clock research is presented as a case of causal isolation robustness analysis: in this field, researchers made use of the notion of robustness to isolate the assumed mechanism behind the circadian rhythm. However, in contrast to earlier philosophical case studies on causal isolation robustness analysis (Weisberg and Reisman in Philos Sci 75:106–131, 2008; Kuorikoski et al. in Br J Philos Sci 61:541–567, 2010), robustness analysis in circadian clock research did not remain at the level of mathematical modeling but combined it with experimentation on model organisms and with a new type of model, the synthetic model.
One of the most conspicuous features of contemporary modeling practices is the dissemination of mathematical and computational methods across disciplinary boundaries. We study this process through two applications of the Ising model: the Sherrington-Kirkpatrick model of spin glasses and the Hopfield model of associative memory. The Hopfield model successfully transferred some basic ideas and mathematical methods originally developed within the study of magnetic systems to the field of neuroscience. As an analytical resource we use Paul Humphreys's discussion of computational and theoretical templates. We argue that model templates are crucial for intra- and interdisciplinary theoretical transfer. A model template is an abstract conceptual idea associated with particular mathematical forms and computational methods.
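The Hopfield transfer mentioned above can be illustrated in a few lines. The sketch below is a minimal, illustrative reconstruction, not code from the paper: it stores a binary pattern via Hebbian couplings (formally the same pairwise couplings as in an Ising/spin-glass Hamiltonian) and retrieves it from a corrupted cue by iterated sign updates.

```python
import numpy as np

def train(patterns):
    """Hebbian learning: couplings W_ij proportional to the sum of s_i * s_j over stored patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-coupling, mirroring the Ising convention
    return W

def recall(W, state, steps=10):
    """Iterated synchronous sign updates; a fixed point is a retrieved memory."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1  # break ties toward +1
    return s

pattern = np.array([1, 1, -1, -1, 1, -1, 1, -1], dtype=float)
W = train(pattern[None, :])   # store a single 8-"neuron" pattern
noisy = pattern.copy()
noisy[0] *= -1                # corrupt the cue by flipping one "spin"
print(np.array_equal(recall(W, noisy), pattern))  # prints: True
```

The corrupted cue is repaired in a single update step; this error-correcting, content-addressable behavior is what made the spin-glass formalism attractive to neuroscience.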
The purpose of this paper is to suggest that models in scientific practice can be conceived of as epistemic artifacts. Approaching models this way accommodates many of the things that working scientists themselves call models but that the semantic conception of models does not duly recognize as such. That models are epistemic artifacts implies, firstly, that they cannot be understood apart from purposeful human activity; secondly, that they are somehow materialized inhabitants of the intersubjective field of that activity; and thirdly, that they can also function as knowledge objects. We argue that models as epistemic artifacts provide knowledge in many more ways than just via direct representative links. To substantiate our view we use a language-technological artifact, a parser, as an example.
There are two traditions of thinking about idealization that offer almost opposite views on its functioning and epistemic status. While one tradition views idealizations as epistemic deficiencies, the other highlights the epistemic benefits of idealization. Both, however, identify idealization with misrepresentation. In this article, we instead approach idealization from the artifactual perspective, comparing it to the distortion-to-reality accounts of idealization and exemplifying it through the case of the Hodgkin and Huxley model of the nerve impulse. From the artifactual perspective, the epistemic benefits and deficiencies introduced by idealization frequently come in a package due to the way idealization draws together different resources in model construction. Accordingly, idealization tends to be holistic in that it is often not easily attributable to just some specific parts of the model. Instead, the idealizing process tightly embeds theoretical concepts and formal tools into the construction of a model.
In his famous article "The Unreasonable Effectiveness of Mathematics in the Natural Sciences," Eugene Wigner argues for a unique tie between mathematics and physics, invoking even religious language: "The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve." The possible existence of such a unique match between mathematics and physics has been extensively discussed by philosophers and historians of mathematics. Whatever the merits of this claim, a further question can be posed with regard to mathematization in science more generally: What happens when we leave the area of theories and laws of physics and move over to the realm of mathematical modeling in interdisciplinary contexts? In modeling phenomena specific to biology or economics, for instance, scientists often use methods that have their origin in physics. How is this kind of mathematical modeling justified?
The growing role of universities in the knowledge economy as well as in technology transfer has increasingly been conceptualized in terms of the hybridization of public academic work and private business activity. In this article, we examine the difficulties and prospects of this kind of intermingling by studying the long-term trajectories of two research groups operating in the fields of plant biotechnology and language technology. In both cases, the attempts to simultaneously pursue academic and commercial activities led to complicated boundary maintenance, which arose from the conflicting procedures and requirements of the two activities as well as from the double roles assumed by the actors involved. We thus argue that the construction of boundaries is not as contingent and strategic as has often been assumed but is built, instead, on the characteristic goals and tasks of the activities in question. Moreover, we suggest that the discussion on university-industry relationships and the entrepreneurial university has by and large neglected the fact that most universities are either public sector entities or tax-exempt organizations, and are thereby subject to strict rules and regulations that govern the ways in which they may become engaged in commercial activities. Furthermore, several other enduring cultural features, such as the university's commitment to open scholarly communication, make the boundary between university and commerce relatively stable. As a consequence, the results of this study lend support to the thesis that boundaries in science are not always created at will but reflect the long history and multifaceted societal relevance of this particular institution. This in turn implies that the commodification of university research is bound to be more difficult than the proponents of the entrepreneurial university seem to assume.
In which respects do modeling and experimenting resemble or differ from each other? We explore this question by studying in detail the combinatorial strategy in synthetic biology whereby scientists triangulate experimentation on model organisms, mathematical modeling, and synthetic modeling. We argue that this combinatorial strategy is due to the characteristic constraints of the three epistemic activities. Moreover, our case study shows that in some cases materiality clearly matters; in fact, it provides the very rationale of synthetic modeling. We show how the materialities of the different kinds of models (biological components versus mathematical symbols), in combination with their different structures (the complexity of biological organisms versus the isolated network structure and its mathematical dynamics), define the spectrum of epistemic possibilities in synthetic biology. Furthermore, our case shows that from the perspective of scientific practice the question of whether or not simulations are like or unlike experiments is often beside the point, since the two are used to accomplish different kinds of things.
Although the emerging field of synthetic biology looks back on barely a decade of development, the stakes are high. It is a multidisciplinary research field that aims at integrating the life sciences with engineering and the physical/chemical sciences. The common goal is to design and construct novel biological components, functions and systems in order to implement, in a controlled way, biological devices and production systems not necessarily found in nature. Among the many potential applications are novel drugs and pesticides, cancer treatments, biofuels, and new materials. According to the most optimistic visions, synthetic biology may thus lead to a biotechnological revolution by transforming microorganisms into ‘factories’ of sorts, which could eventually displace conventional industrial methods.
The proponents of the entrepreneurial university have claimed that it implies adjustments in the normative structure of science. In this article, I will critically examine whether a qualitatively new kind of academic ethos can emerge from the commercialization of academic research. The traditional conception of norms of science as institutionalized imperatives is distinguished from the constructivist conception of norms as strategic or ideological resources. An empirical case study on the commercialization of the research of one academic language-technology group is presented. The case study does not support the constructivist conclusion that the norms of science are malleable at will.
This paper discusses modeling from the artifactual perspective. The artifactual approach conceives of models as erotetic devices: purpose-built systems of dependencies that are constrained in view of answering a pending scientific question, motivated by theoretical or empirical considerations. In treating models as artifacts, the artifactual approach is able to address the various languages of the sciences that are overlooked by the traditional accounts, which concentrate on the relationship of representation in an abstract and general manner. In contrast, the artifactual approach focuses on the epistemic affordances of the different kinds of external representational and other tools employed in model construction. In doing so, the artifactual account gives a unified treatment of different model types, as it circumvents the tendency of the fictional and other representational approaches to separate model systems from their "model descriptions".
This paper examines two parallel discussions of scientific modeling which have invoked experimentation in addressing the role of models in scientific inquiry. One side discusses the experimental character of models, whereas the other focuses on their exploratory uses. Although both relate modeling to experimentation, they do so differently. The former has considered the similarities and differences between models and experiments, addressing, in particular, the epistemic value of materiality. By contrast, the focus on exploratory modeling has highlighted the various kinds of exploratory functions of models in the early stages of inquiry. These two perspectives on modeling are discussed through a case study in the field of synthetic biology. The research practice in question explores biological control by making use of an ensemble of different epistemic means: mathematical models and simulations, synthetic genetic circuits and intracellular measuring devices, and finally electronic circuits. We argue that the study of exploratory modeling should trace the ways different epistemic means, in different materialities, are being combined over time. Finally, the epistemic status of such novel investigative objects as synthetic genetic circuits is evaluated, with the conclusion that they can function as both experiments and models.
Scientists have used models for hundreds of years as a means of describing phenomena and as a basis for further analogy. In Scientific Models in Philosophy of Science, Daniela Bailer-Jones assembles an original and comprehensive philosophical analysis of how models have been used and interpreted in both historical and contemporary contexts. Bailer-Jones delineates the many forms models can take (ranging from equations to animals; from physical objects to theoretical constructs), and how they are put to use. She examines early mechanical models employed by nineteenth-century physicists such as Kelvin and Maxwell, describes their roots in the mathematical principles of Newton and others, and compares them to contemporary mechanistic approaches. Bailer-Jones then views the use of analogy in the late nineteenth century as a means of understanding models and of linking different branches of science. She reveals how analogies can also be models themselves, or can help to create them. The first half of the twentieth century saw little mention of models in the literature of logical empiricism. Focusing primarily on theory, logical empiricists believed that models were of temporary importance, flawed, and awaiting correction. The later contesting of logical empiricism, particularly of the hypothetico-deductive account of theories, by philosophers such as Mary Hesse sparked a renewed interest in the importance of models during the 1950s that continues to this day. Bailer-Jones analyzes subsequent proposals: models as metaphors; Kuhn's concept of a paradigm; the Semantic View of theories; and the case-study approaches of Cartwright and Morrison, among others. She then engages current debates on topics such as phenomena versus data, the distinctions between models and theories, the concepts of representation and realism, and the discerning of falsities in models.
Although the interdisciplinary nature of contemporary biological sciences has been addressed by philosophers, historians, and sociologists of science, the different ways in which engineering concepts and methods have been applied in biology have been somewhat neglected. We examine, using the mechanistic philosophy of science as an analytic springboard, the transfer of network methods from engineering to biology through the cases of two biology laboratories operating at the California Institute of Technology. The two laboratories study gene regulatory networks, but in remarkably different ways. The research strategy of the Davidson lab fits squarely into traditional mechanist philosophy in its aim to decompose and reconstruct, in detail, the gene regulatory networks of a chosen model organism. In contrast, the Elowitz lab constructs minimal models that do not attempt to represent any particular naturally evolved genetic circuits. Instead, it studies the principles of gene regulation through a template-based approach that is applicable to any kind of network, whether biological or not. We call on mechanists to consider whether the latter approach can be accommodated by the mechanistic approach, and what kinds of modifications it would imply for the mechanistic paradigm of explanation, were it to address modelling more generally.
In synthetic biology, the use of engineering metaphors to describe biological organisms and their behavior has become common practice. The concept of noise provides one of the most compelling examples of such transfer. But the notion is also confusing: while in engineering noise is a destructive force perturbing artificial systems, in synthetic biology it has acquired an additional functional meaning. It has been found that noise is an important factor driving biological processes such as gene regulation, development, and evolution. How did noise acquire this dual meaning in the field of synthetic biology? In this paper we study the emergence of the functional meaning of noise in relation to synthetic modeling. We pay particular attention to the interdisciplinary aspects of this process, highlighting the way borrowed concepts, analogical reasoning, and the use of cross-disciplinary computational templates were entwined in it.