It is far from obvious that outside of highly specialized domains such as commercial agriculture, the methodology of biometrics—quantitative comparisons over groups of organisms—should be of any use in today’s bioinformatically informed biological sciences. The methods in our biometric textbooks, such as regressions and principal components analysis, make assumptions of homogeneity that are incompatible with current understandings of the origins of developmental or evolutionary data in historically contingent processes, processes that might have come out otherwise; the appropriate statistical methods are those suited to random walks, not normal distributions. A valid methodology would further require especially close attention to the differences among aspects of processes that are plastic, those that encode their own histories or biographies, and the very small fraction of quantifications that can usefully and realistically be modeled as varying by colored noise around a central tendency that itself has some quantitative meaning. This point of view—that only a vanishingly small fraction of the quantitative information borne by any living organism is worth quantifying—is illustrated by some data on a human birth defect, namely, fetal alcohol syndrome. In a suggestive metaphor, the biometrician is like the pilgrim in Friedrich’s painting Der Wanderer über dem Nebelmeer, uncertain whether to measure the mountains or the clouds. The mountains stand for contingent history, the clouds for the subset of the data most closely matching controlled experiments suitable for quantitative biometric summary. Biometrics applies to the clouds, not the mountains. The success of statistical methods comes at the expense of all the theories that we simultaneously hold to be true about the biological materials to which they both pertain. Biometrics is thus complementary to all of the emerging reductionist sciences of biological structure.
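To make that contrast concrete, here is a minimal sketch with invented data (nothing here is from the article itself): a variable generated by a random walk has no stable central tendency, so the sample mean and variance presupposed by regression and principal components analysis lose their quantitative meaning, whereas noise around a fixed mean retains it.

```python
# Hypothetical illustration: a random walk versus noise around a
# fixed central tendency. Data and sample sizes are invented.
import numpy as np

rng = np.random.default_rng(0)
T = 1000

walk = np.cumsum(rng.normal(size=T))   # random walk: no stable mean or variance
noise = rng.normal(size=T)             # i.i.d. noise: the mean (0) is meaningful

# The variance of a random walk grows roughly linearly with elapsed
# time, so a grand mean over the whole series summarizes nothing.
for n in (100, 1000):
    print(f"n={n:4d}  var(walk)={walk[:n].var():8.2f}  "
          f"var(noise)={noise[:n].var():5.2f}")
```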
Several disciplines share an interest in the evolutionary selection pressures that shaped human physical functioning and appearance, psyche, and behavior. The methodologies invoked across the disciplines studying these domains are often based on different rhetorics, and hence may conflict; progress in one field is thereby hindered from transferring effectively to others. Topics at the intersection of anthropometry and psychometry, such as the impact of sexual selection on the hominin face, are a typical example. Since the underlying theory explicitly places facial form in the middle of a causal chain, as the mediating variable between biological causes and psychological effects, a particularly convenient conceptual and analytic scenario arises as follows. Modern morphometrics allows analysis of shape both “backwards” and “forwards.” The two computations are commensurate, hence the two kinds of effects can be compared and evaluated as directions in the same morphospace. We suggest translating the morphometric methodology of “Darwinian aesthetics” into this space, where psychological and other processes of interest can be coded commensurately. Such a translation permits researchers to relate the effects of biological processes on form to the perceptions of the same processes in one unified “psychomorphospace.”
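One way to picture such a unified space, sketched below with invented data and hypothetical variable names (hormone for a biological cause, rating for a perceptual response; none of this is the authors’ own code): each effect is coded as the direction in shape space along which shape covaries with the scalar in question, and the “forwards” and “backwards” effects can then be compared as an angle between directions.

```python
# Hypothetical sketch of a "psychomorphospace" comparison. Shape effects
# of a biological cause and of a perceptual response are each coded as
# direction vectors in the same shape space, then compared by angle.
import numpy as np

rng = np.random.default_rng(1)
n, p = 60, 20                      # specimens, shape coordinates
shape = rng.normal(size=(n, p))    # stand-in for Procrustes shape coordinates
hormone = rng.normal(size=n)       # hypothetical biological cause
rating = shape @ rng.normal(size=p) + rng.normal(size=n)  # perceptual score

def shape_direction(y, X):
    """Per-coordinate regression of shape on a scalar: one vector in morphospace."""
    yc = y - y.mean()
    return (X - X.mean(0)).T @ yc / (yc @ yc)

d_forward = shape_direction(hormone, shape)   # effect of biology on form
d_backward = shape_direction(rating, shape)   # form as seen by perceivers

cosine = d_forward @ d_backward / (
    np.linalg.norm(d_forward) * np.linalg.norm(d_backward))
print(f"angle between effects: {np.degrees(np.arccos(cosine)):.1f} degrees")
```

An angle near zero would indicate that perceivers respond to the same shape direction that the biological process produces.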
Although Harry Woolf’s great collective volume Quantification mostly overlooked biology, Thomas Kuhn’s chapter there on the role of quantitative measurement within the physical sciences maps quite well onto the forms of reasoning that actually persuade us as biologists 50 years later. Kuhn distinguished between two contexts, that of producing quantitative anomalies and that of resolving them. The implied form of reasoning is actually C. S. Peirce’s abduction or inference to the best explanation: “The surprising fact C is observed; but if A were true, C would be a matter of course; hence there is reason to suspect that A is true.” This article reviews abduction and the Kuhnian dichotomy in a range of classic examples where quantitative reasoning has ended arguments in the natural sciences. Included are John Snow’s discovery of the cause of cholera, Jean Perrin’s proof that atoms exist, the discovery of the double helix, the Alvarezes’ explanation of the extraterrestrial origin of the Cretaceous-Tertiary extinction, and current examples in passive smoking, ulcers, and the anthropogenicity of global warming. Modern biology is a quantitative science to the extent that we operate by “strong inference,” the insistence that our data are surprising on everybody else’s hypotheses but follow as a matter of course from our own, and that we demand numerical consilience whenever we infer across levels of analysis or across disciplines.
Fetal alcohol syndrome (FAS), the most common avoidable human birth defect, is the extensive irreversible brain damage caused by heavy prenatal alcohol exposure. Following the discovery of FAS in 1973, a multidisciplinary research community began applying discipline-specific methods to investigate the mechanisms underlying FAS and its consequences for the victims’ cognition and social behavior. An academic biomathematician and statistician, I have collaborated since 1984 with one American research group studying this condition.
The current literature that attempts to bridge between geometric morphometrics (GMM) and finite element analyses (FEA) of CT-derived data from bones of living animals and fossils appears to lack a sound biotheoretical foundation. To supply the missing rigor, the present article demonstrates a new rhetoric of quantitative inference across the GMM–FEA bridge—a rhetoric bridging form to function when both have been quantified so stringently. The suggested approach is founded on diverse standard textbook examples of the relation between forms and the way strains in them are produced by stresses imposed upon them. One potentially cogent approach to the explanatory purposes driving studies of this class arises from a close scrutiny of the way in which computations in both domains, shape and strain, can be couched as minimizations of a scalar quantity. For GMM, this is ordinary Procrustes shape distance; in FEA, it is the potential energy that is stored in the deformed configuration of the solid form. A hybrid statistical method is introduced requiring that all forms be subjected to the same detailed loading designs (the same “probes”) in a manner careful to accommodate the variations of those same forms before they were stressed. The proper role of GMM is argued to be the construction of regressions for strain energy density on the largest-scale relative warps in order that biological explanations may proceed in terms of the residuals from those regressions: the local residual features of strain energy density. The method, evidently a hierarchical one, might be intuitively apprehended as a geometrical approach to a formal allometric analysis of strain. The essay closes with an exhortation.
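The GMM half of that shared-minimization framing is easy to exhibit. Below is a minimal sketch of the standard textbook computation (not the article’s own code): the partial Procrustes distance between two landmark configurations, the scalar left over once translation, scale, and rotation have been minimized away.

```python
# Standard ordinary Procrustes computation, sketched for illustration.
# Reflections are not excluded in this short version.
import numpy as np

def procrustes_distance(X, Y):
    """Partial Procrustes distance between two k x 2 (or k x 3) landmark sets."""
    Xc = X - X.mean(0)                    # minimize over translation
    Yc = Y - Y.mean(0)
    Xc = Xc / np.linalg.norm(Xc)          # minimize over scale (unit centroid size)
    Yc = Yc / np.linalg.norm(Yc)
    s = np.linalg.svd(Xc.T @ Yc, compute_uv=False)  # optimal rotation via SVD
    return np.sqrt(max(0.0, 2.0 - 2.0 * s.sum()))   # residual scalar after fitting

# Two hypothetical triangles, each three landmarks in the plane:
A = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0], [0.6, 0.9]])
print(f"Procrustes distance: {procrustes_distance(A, B):.4f}")
```

The FEA half, minimization of the potential energy stored over all admissible displacements, would play the corresponding role on the strain side.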
Complexity in our universe, Herbert Simon once noted, generally takes a hierarchical, nearly decomposable form. If our purpose as biologists is to "carve Nature at the joints," then the quantitative biologist's pattern questions must embody some tentative claim of where the explanatory joints are—only after meaningful qualifications can notions of variance and covariance make sense. In morphometrics, specimens and variables alike can be "carved at the joints," with a correspondingly great gain in explanatory power in both versions. Simon's advice is that the competent biologist's measurements should lie entirely within a single organismal component or else deal entirely with one of the joints. In either context, our best contemporary rhetorics of explanation in biology may resemble morphometrics in their frank combination of carefully supervised parallel quantifications that, taken together, result in new qualifications, leading in turn to new quantifications, and so on. In short, the relation between qualitative and quantitative in the organismal biological sciences is not an opposition but a complementarity, and the modern biometrical statistics of organismal form may be a particularly apposite praxis for exploring it.
Back in 1987 the physicist/theoretical biologist Walter Elsasser reviewed a range of philosophical issues at the foundation of organismal biology above the molecular level. Two of these are particularly relevant to quantifications of form: the concept of ordered heterogeneity and the principle of nonstructural memory, the truism that typically the forms of organisms substantially resemble the forms of their ancestors. This essay attempts to weave Elsasser’s principles together with morphometrics for one prominent data type, the representation of animal forms by positions of landmark points. I introduce a new spectrum of pattern descriptors of the shape variation of landmark configurations, ranging from the most homogeneous through growth gradients and focal features to a model of totally disordered heterogeneity incompatible with the rhetoric of functional explanation. An investigation may end up reporting its findings by one of these descriptors or by several. These descriptors all derive from one shared diagrammatic device: a log–log plot of sample variance against one standard formalism of today’s geometric morphometrics, the bending energies of the principal warps that represent all the scales of variability around the sample average shape. The new descriptors, which I demonstrate over a variety of contemporary morphometric examples, may help build the bridge we urgently need between the arithmetic of today’s burgeoning image-based data resources and the rhetoric of biological explanations aligned with the principles of Elsasser along with an even earlier philosopher of biology, the Viennese visionary Hans Przibram.
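A schematic of that diagrammatic device, under explicit assumptions (two-dimensional landmarks, the thin-plate spline kernel U(r) = r² log r, toy data; the variable names are mine, not the essay’s): compute the bending-energy matrix at the sample average shape, take its eigenvectors as the principal warps, score each specimen’s deviation from the average on each warp, and read off log variance against log bending energy.

```python
# Sketch of the log-log device: per-warp sample variance against the
# bending energies of the principal warps of the mean configuration.
import numpy as np

def bending_energy_matrix(mean_shape):
    """Bending-energy matrix of a k x 2 configuration (thin-plate spline)."""
    k = mean_shape.shape[0]
    d = np.linalg.norm(mean_shape[:, None] - mean_shape[None, :], axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d > 0, d**2 * np.log(d), 0.0)   # kernel U(r) = r^2 log r
    Q = np.hstack([np.ones((k, 1)), mean_shape])     # affine part
    L = np.zeros((k + 3, k + 3))
    L[:k, :k] = K
    L[:k, k:] = Q
    L[k:, :k] = Q.T
    return np.linalg.inv(L)[:k, :k]                  # upper-left block of L^-1

rng = np.random.default_rng(2)
k, n = 10, 40
mean_shape = rng.uniform(0, 1, size=(k, 2))
sample = mean_shape + 0.01 * rng.normal(size=(n, k, 2))   # toy sample

Bk = bending_energy_matrix(mean_shape)
eigval, eigvec = np.linalg.eigh(Bk)
nonzero = eigval > 1e-8                  # drop the three affine (zero-BE) terms
warps, energies = eigvec[:, nonzero], eigval[nonzero]

# Scores of each specimen on each principal warp (x and y pooled), then
# the log-log pairs from which the pattern descriptors would be read:
dev = sample - mean_shape
scores = np.einsum("nkd,kw->nwd", dev, warps)
variances = scores.var(axis=0).sum(axis=-1)
for be, v in zip(energies, variances):
    print(f"log BE = {np.log(be):6.2f}   log variance = {np.log(v):7.2f}")
```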
Chow sets his version of statistical significance testing in an impoverished context of “theory corroboration” that explicitly excludes well-posed theories admitting of strong support by precise empirical evidence. He demonstrates no scientific usefulness for the problematic procedure he recommends instead. The important role played by significance testing in today's behavioral and brain sciences is wholly inconsistent with the rhetoric he would enforce.
Neural organization attempts to thwart, at least in part, modern neuroscientists' tendency to focus reductionistically on ever smaller microsystems. But although emphasizing higher levels of systems organization, the authors end up enforcing reductionisms of their own, principally the reduction of their domain to the study of invariable normal functioning, without explicit modeling of the deviations that constitute disease states or aging. This reductionism seriously weakens the authors' claims about the truth of their quantitative models.