Research on the human microbiome has generated a staggering amount of sequence data, revealing variation in microbial diversity at the community, species (or phylotype), and genomic levels. In order to make this complexity more manageable and easier to interpret, new units—the metagenome, core microbiome, and enterotype—have been introduced in the scientific literature. Here, I argue that analytical tools and exploratory statistical methods, coupled with a translational imperative, are the primary drivers of this new ontology. By reducing the dimensionality of variation in the human microbiome, these new units render it more tractable and easier to interpret, and hence serve an important heuristic role. Nonetheless, there are several reasons to be cautious about these new categories prematurely "hardening" into natural units: a lack of constraints on what can be sequenced metagenomically, freedom of choice in taxonomic level in defining a "core microbiome," typological framing of some of the concepts, and possible reification of statistical constructs. Finally, lessons from the Human Genome Project have led to a translational imperative: a drive to derive results from the exploration of microbiome variation that can help to articulate the emerging paradigm of personalized genomic medicine (PGM). There is a tension between the typologizing inherent in much of this research and the personal in PGM.
As the international genomic research community moves from the tool-making efforts of the Human Genome Project into biomedical applications of those tools, new metaphors are being suggested as useful to understanding how our genes work – and for understanding who we are as biological organisms. In this essay we focus on the Human Microbiome Project as one such translational initiative. The HMP is a new 'metagenomic' research effort to sequence the genomes of human microbiological flora, in order to pursue the interesting hypothesis that our 'microbiome' plays a vital and interactive role with our human genome in normal human physiology. Rather than describing the human genome as the 'blueprint' for human nature, the promoters of the HMP stress the ways in which our primate lineage DNA is interdependent with the genomes of our microbiological flora. They argue that the human body should be understood as an ecosystem with multiple ecological niches and habitats in which a variety of cellular species collaborate and compete, and that human beings should be understood as 'superorganisms' that incorporate multiple symbiotic cell species into a single individual with very blurry boundaries. These metaphors carry interesting philosophical messages, but their inspiration is not entirely ideological. Instead, part of their cachet within genome science stems from the ways in which they are rooted in genomic research techniques, in what philosophers of science have called a 'tools-to-theory' heuristic. Their emergence within genome science illustrates the complexity of conceptual change in translational research, by showing how it reflects both aspirational and methodological influences.
In this paper, I discuss several temporal aspects of paleontology from a philosophical perspective. I begin by presenting the general problem of "taming" deep time to make it comprehensible at a human scale, starting with the traditional geologic time scale: an event-based, relative time scale consisting of a hierarchy of chronological units. Not only does the relative timescale provide a basis for reconstructing many of the general features of the history of life, but it is also consonant with the cognitive processes humans use to think about time. Absolute dating of rocks, fossils, and evolutionary events (such as branching events on the tree of life) can be accomplished through the use of radiometric dating, chronological signals extractable from fossil growth patterns, and the "molecular clock." Sometimes these different methods of absolute dating, which start from largely independent assumptions and evidentiary bases, converge in their temporal estimates, resulting in a consilience of inductions. At other times they fail to agree, either because fossils and molecules are giving temporal information about different aspects of nature and should not be expected to agree, or because of flawed assumptions that give rise to an inaccurate estimate. I argue that in general, despite the fact that it can be difficult to integrate disparate kinds of evidence, the principle of total evidence should be applied to the dating of evolutionary events. As a historical science, paleontology studies past events we cannot observe directly. This raises questions of epistemic access, meaning that due to the fragmentary nature of the fossil record we may find ourselves without access to the relevant traces to adjudicate between rival hypotheses about the past. The problems and prospects of epistemic access are explored through a case study of the reconstruction of the colors of dinosaurs. The paper closes with a reflection on the Darwin-Lyell metaphor of the fossil record as a highly fragmentary history book, and a call for a reconsideration of the book metaphor in favor of a systems view of the geologic and fossil records.
Rolling Stones guitarist Keith Richards has argued that rock and roll happens from the neck down. In this contribution to The Rolling Stones and Philosophy, edited by Luke Dick and George Reisch, I draw on neuroscience to argue that, in the parlance of John Stuart Mill, rock and roll is both a higher and a lower pleasure.
The exploration of popular culture topics by academic philosophers for non-academic audiences has given rise to a distinctive genre of philosophical writing. Edited volumes with titles such as Black Sabbath and Philosophy or Buffy the Vampire Slayer and Philosophy contain chapters by multiple philosophical authors that attempt to bring philosophy to popular audiences. Two dominant models have emerged in the genre. On the pedagogical model, authors use popular culture examples to teach the reader philosophy. The end is to promote philosophical literacy, defined as acquaintance with the key problems, ideas, and figures in the history of philosophy. In contrast, on the applied philosophy model, authors use philosophy to open up new dimensions of the popular culture topic for fans. The end is to illustrate the value of philosophy in understanding the popular culture topic, and ultimately, to demonstrate the value of philosophy in general. Taking stock of the relative strengths and weaknesses of these two models provides an opportunity to reflect more broadly on whether, why, and how philosophers should engage the public.
Musician, free spirit, drug addict, voice of the weak and disenfranchised, Christian, family man, singer of love, God, and murder. There is much to say about Johnny Cash, but it can also be put in a simple formula: he was the "Man in Black." He sang for hardened criminals at San Quentin, for Richard Nixon, and for every American president after him. He despaired over his love for June Carter and later found fulfillment with her. He championed musicians such as Bob Dylan and Kris Kristofferson, deliberately recorded bad albums, and was rediscovered by a new generation in the 1990s. What lies behind this career, what moved this man, and how are his songs and his worldview to be understood? Twenty-one philosophers examine the life and work of this American icon, revealing a world between art, commerce, and Kant.
The introduction of computer simulation to paleobiology ushered in a new, experimental style of reasoning. Rather than starting with observed fossil patterns and hypothesizing causal processes that may have produced them, it became possible to start with a process model, and from it to simulate a range of possible patterns. The MBL Model is a stochastic model of phylogenetic evolution. Computer simulations conducted with the MBL Model served as thought experiments in stochastic evolution. In the MBL work, similarities between empirical and stochastically simulated clades mounted a visual argument that stochastic processes potentially explained large fluctuations in diversity. Stanley and others countered that the similarities were an artifact of scaling. The scaling problem may have been obscured by visual bias: measures of clade shape are scale-invariant, but diversification itself is highly scale-dependent. Null and neutral models are frequently conflated. Null models generate hypothetical data distributions under conditions that exclude some process of interest to test whether patterns in an actual data distribution provide statistical evidence for that process. A neutral model assumes selective equivalence among all units at a specified hierarchical level in the evolving system. Neutral models are often inappropriate as null models. Pattern recognition presupposes a null model, but tacit null models are subject to persistent cognitive biases, necessitating explicit formulation of null models. Nonetheless, failure to reject the null model should not preclude further investigation of a pattern. Often, models are not candidates for falsification, but serve to simulate data for testing concepts or methods. Sepkoski and Kendrick, and Robeck, Maley, and Donoghue used the MBL Model to test whether mass extinction periodicity is an artifact of paraphyly in taxonomic data. Such numerical experiments, which make all assumptions explicit, can serve as templates for localizing points of disagreement. Models may fall into lineages, each generation providing lessons for the next, but often sharing operational assumptions. Artifacts of shared operational assumptions are often a threat to the validity of inferences. Results holding across models should not be accepted unless they withstand rigorous robustness analysis.
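As a purely illustrative sketch (not code from the dissertation or the original MBL study, with all parameter names and values assumed for the example), the kind of stochastic branching-and-extinction process the MBL Model exemplifies can be simulated as follows: at each time step every living lineage independently branches, goes extinct, or persists with fixed probabilities, and clade diversity is tracked through time.

    import random

    def simulate_clade(n_steps=100, p_branch=0.1, p_extinct=0.1, seed=42):
        # Minimal MBL-style simulation (illustrative assumption, not the
        # original model code): each living lineage independently branches,
        # goes extinct, or persists with fixed probabilities per time step.
        # Returns clade diversity (number of living lineages) at each step.
        random.seed(seed)
        living = 1                    # start from a single founding lineage
        diversity = [living]
        for _ in range(n_steps):
            next_living = 0
            for _ in range(living):
                r = random.random()
                if r < p_branch:                   # lineage splits in two
                    next_living += 2
                elif r < p_branch + p_extinct:     # lineage goes extinct
                    next_living += 0
                else:                              # lineage persists
                    next_living += 1
            living = next_living
            diversity.append(living)
            if living == 0:                        # clade has died out
                break
        return diversity

    if __name__ == "__main__":
        print(simulate_clade())

Running such a simulation repeatedly with different seeds yields a range of possible diversity trajectories from a single, selectively neutral process, which is the sense in which simulated clades can be compared visually with empirical ones.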
One of us works in translation and interpreting studies and has written on collaborative translation; the other works in philosophy of medicine. In our open peer commentary, we will focus attention on...