It is argued that Husserl was an “externalist” in at least one sense. For it is argued that Husserl held that genuinely perceptual experiences—that is to say, experiences that are of some real object in the world—differ intrinsically, essentially and as a kind from any hallucinatory experiences. There is, therefore, no neutral “content” that such perceptual experiences share with hallucinations, differing from them only over whether some additional non-psychological condition holds or not. In short, it is argued that Husserl was a “disjunctivist”. In addition, it is argued that Husserl held that the individual object of any experience, perceptual or hallucinatory, is essential to and partly constitutive of that experience. The argument focuses on three aspects of Husserl’s thought: his account of intentional objects, his notion of horizon, and his account of reality.
What follows is a first step towards an ontology of conscious mental processes. We provide a theoretical foundation and characterization of conscious mental processes based on a realist theory of intentionality and using BFO as our top-level ontology. We distinguish three components of an intentional mental process: character, directedness, and objective referent, and describe several features of the process character and directedness significant to defining and classifying mental processes. We arrive at the definition of a representational mental process as a process that is the bringing into being, sustaining, modifying, or terminating of a mental representation. We conclude by outlining some benefits and applications of this approach.
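A toy data model can make the three-component analysis concrete. The sketch below is our own illustration in Python, with hypothetical class and field names; it is not the paper’s BFO formalization:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Character(Enum):
    # Hypothetical examples of process character (the kind of act involved).
    PERCEIVING = auto()
    REMEMBERING = auto()
    IMAGINING = auto()

@dataclass
class IntentionalMentalProcess:
    """Toy model of the three-component analysis (illustrative only)."""
    character: Character        # what kind of mental process it is
    directedness: str           # the mode of aiming at an object (e.g. "visual")
    objective_referent: object  # the real-world entity the process is about;
                                # None for processes lacking a real referent

# Example: a visual perception directed at a particular tree.
perceiving_tree = IntentionalMentalProcess(
    character=Character.PERCEIVING,
    directedness="visual",
    objective_referent="the oak outside my window",
)
```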
The authors outline the way in which documents as social objects have evolved from their earliest forms to the electronic documents of the present day. They note that while certain features have remained consistent, processes of document authentication are seriously complicated by the easy reproducibility of digital entities. The authors argue that electronic documents also raise significant questions concerning the theory of ‘documentality’ advanced by Maurizio Ferraris, especially given that interactive documents seem to blur the distinction between the static documents (or ‘inscriptions’) which form Ferraris’s starting point and dynamic software processes. The authors argue further that the Ferraris view as applied to legal documents is flawed because courts may treat contractual obligations as enduring even in the complete absence of enduring inscriptions. Finally, the authors note that traces in brains, another important family of inscriptions (as Ferraris conceives them), differ significantly from genuinely documentary inscriptions in their lack of public inspectability.
We propose as a UCGIS research priority the topic of “Ontological Foundations for Geographic Information.” Under this umbrella we unify several interrelated research subfields, each of which deals with a different perspective on geospatial ontologies and their roles in geographic information science. While each of these subfields could be addressed separately, we believe it is important to address ontological research in a unitary, systematic fashion, embracing conceptual issues concerning what would be required to establish an exhaustive ontology of the geospatial domain, issues relating to the choice of appropriate methods for formalizing ontologies, and considerations regarding the design of ontology-driven information systems. This integrated approach is necessary because there is a strong dependency between the methods used to specify an ontology and the conceptual richness, robustness, and tractability of the ontology itself. Likewise, information system implementations are needed as testbeds for the usefulness of every aspect of an exhaustive ontology of the geospatial domain. None of the current UCGIS research priorities provides such an integrative perspective; the topic of “Ontological Foundations for Geographic Information Science” is therefore unique.
This book evaluates the increasingly wide variety of intellectual resources for research methods and methodologies and investigates what constitutes good educational research. Written by a distinguished international group of philosophers of education, it questions what sorts of research can usefully inform policy and practice, and what inferences can be drawn from different kinds of research. It demonstrates the critical engagement of philosophers of education with the wider educational research community and illustrates the benefits that can accrue from such engagement.
Standardizing consultation processes is increasingly important as clinical ethics consultation becomes more utilized in and vital to medical practice. Solid organ transplant represents a relatively nascent field replete with complex ethical issues that, while explored, have not been systematically classified. In this paper, we offer a proposed taxonomy that separates issues of resource allocation from viable solutions to the problem of organ shortage in transplant, and then further distinguishes between policy-level and bedside-level issues. We then identify all transplant-related ethics consults performed at the Cleveland Clinic between 2008 and 2013 in order to determine how consultants conceptually framed their consultations by the domains they ascribed to each case. We code the CC domains to those in the Core Competencies for Healthcare Ethics Consultation in order to initiate a broader conversation regarding best practices in these highly complex cases. A discussion of the ethical issues underlying living donor- and recipient-related consults ensues. Finally, we suggest that the ethical domains prescribed in the Core Competencies provide a strong starting point for a common intra-disciplinary language in the realm of formal CEC.
The “basic debate” in business ethics between shareholder theory and stakeholder theory has underlain the field since its inception, with wide-ranging normative, descriptive, and instrumental arguments offered on both sides. We maintain that insofar as this is primarily a normative debate, clarity can be brought by elucidating how it is framed by the political philosophies of liberalism and libertarianism. With liberalism represented by John Rawls’s theory of justice and libertarianism represented by the ideas of Milton Friedman and Robert Nozick, and Edward Freeman, the paper shows that both liberalism and libertarianism can be interpreted to justify shareholder and stakeholder theory respectively. The debate between shareholder theory and stakeholder theory is framed by liberal and libertarian justifications that hinge primarily on whether and to what extent one should have exogenous or endogenous safeguards on corporate behavior. Accordingly, political philosophy turns out to be highly relevant to both business ethics and corporate governance, not because the corporation resembles the state, but because of the potential safeguards placed on the corporation by the state.
Despite continuing controversies regarding the vital status of both brain-dead donors and individuals who undergo donation after circulatory death (DCD), respecting the dead donor rule (DDR) remains the standard moral framework for organ procurement. The DDR increases organ supply without jeopardizing trust in transplantation systems, reassuring society that donors will not experience harm during organ procurement. While the assumption that individuals cannot be harmed once they are dead is reasonable in the case of brain-death protocols, we argue that the DDR is not an acceptable strategy to protect donors from harm in DCD protocols. We propose a threefold alternative to justify organ procurement practices: (1) ensuring that donors are sufficiently protected from harm; (2) ensuring that they are respected through informed consent; and (3) ensuring that society is fully informed of the inherently debatable nature of any criterion for declaring death.
This volume presents a many-faceted view of the great Oxford philosopher R. G. Collingwood. At its centre is his Autobiography of 1939, a cult classic for its compelling 'story of his thought'. That work is accompanied here by previously unpublished writings by Collingwood and eleven specially written essays on aspects of his life and work.
In the present review we focus on what we take to be some remaining issues with the Behaviour Change Intervention Ontology (BCIO). We are in full agreement with the authors’ endorsement of the principles of best practice for ontology development. In particular, we agree that an ontology should be “logically consistent and having a clear structures [sic], preferably a well-organised hierarchical structure,” and that “Maximising the new ontology’s interoperability with existing ontologies by reusing entities from existing ontologies where appropriate” is critically important (Wright et al., 2020, p. 17). Our remaining concerns with BCIO relate directly to these two principles. First, we identify a number of issues with some of the classifications and definitions in BCIO that seem to be in tension with the first principle. Second, we note some reservations about the reuse of certain classes in BCIO, namely from the Gazetteer (GAZ), the Ontology of Medically Related Social Entities (OMRSE), and the Information Artifact Ontology (IAO). While the second principle, of reuse, is important, it is also important not to let the reuse of existing classes (or their corresponding definitions) compromise the logical integrity or the realist nature of one’s ontology.
In “Ontologies Relevant to Behaviour Change Interventions: A Method for their Development,” Wright et al. outline a step-by-step process for building ontologies of behaviour modification – what the authors call the Refined Ontology Developmental Method (RODM) – and demonstrate its use in the development of the Behaviour Change Intervention Ontology (BCIO). RODM is based on the principles of good ontology building used by the Open Biological and Biomedical Ontology (OBO) Foundry, in addition to those outlined in Arp, Smith, and Spear (2015). BCIO uses Basic Formal Ontology (BFO) as its top-level ontology. The methods outlined by Wright et al. are a valuable contribution to the field, especially the use of formal mechanisms for literature annotation and expert stakeholder review, and the BCIO will certainly play an important role in the extension of OBO Foundry ontologies into the behavioural domain.
Applied ontologies have been used more and more frequently to enhance systems engineering. In this paper, we argue that adopting principles of ontological realism can increase the benefits that ontologies have already been shown to provide to the systems engineering process. Moreover, adopting Basic Formal Ontology (BFO), an ISO standard for top-level ontologies from which more domain-specific ontologies are constructed, can lead to benefits in four distinct areas of systems engineering: (1) interoperability, (2) standardization, (3) testing, and (4) data exploitation. Reaping these benefits in a model-based systems engineering (MBSE) context requires utilizing an ontology’s vocabulary when modeling systems and the entities within those systems. If the chosen ontology abides by the principles of ontological realism, a semantic standard capable of uniting distinct domains, using BFO as a hub, can be leveraged to promote greater interoperability among systems. As interoperability and standardization increase, so does the ability to collect data during the testing and implementation of systems. These data can then be reasoned over by computational reasoners using the logical axioms within the ontology. This, in turn, generates new data that would have been impossible or too inefficient to generate without the aid of computational reasoners.
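As a rough illustration of that last step, off-the-shelf OWL tooling can already perform such reasoning. The sketch below uses the owlready2 Python library to load BFO from its standard OBO PURL and run a description-logic reasoner; the snippet is our own illustration, not drawn from the paper, and it requires Java for the HermiT reasoner bundled with owlready2:

```python
# Minimal sketch of machine reasoning over an ontology's logical axioms.
# Any BFO-conformant domain ontology could be loaded the same way.
from owlready2 import get_ontology, sync_reasoner

bfo = get_ontology("http://purl.obolibrary.org/obo/bfo.owl").load()

# Run a description-logic reasoner to compute the inferred class
# hierarchy from the asserted axioms.
with bfo:
    sync_reasoner()

# Inspect the result: each class with its (possibly newly inferred) parents.
for cls in list(bfo.classes())[:10]:
    print(cls, "->", cls.is_a)
```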
How should deontological theories that prohibit actions of type K — such as intentionally killing an innocent person — deal with cases of uncertainty as to whether a particular action is of type K? Frank Jackson and Michael Smith, who raise this problem in their paper “Absolutist Moral Theories and Uncertainty” (2006), focus on a case where a skier is about to cause the death of ten innocent people — we don’t know for sure whether on purpose or not — by causing an avalanche, and we can only save the people by shooting the skier. One possible deontological attitude towards such uncertainty is what Jackson and Smith call the threshold view, according to which whether or not the deontological constraint applies depends on whether our degree of (justified) certainty meets a given threshold. Jackson and Smith argue against the threshold view that it leads to implausible paradoxical moral dilemmas in a special kind of case. In this response, we show that the threshold view can avoid these implausible moral dilemmas, as long as the relevant deontological constraint is grounded in individualistic patient-based considerations, such as what an individual person is entitled to object to.
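The threshold view’s decision rule can be stated compactly. The following Python sketch is our own illustration of the rule; the function name and the 0.9 threshold are invented for the example, not taken from Jackson and Smith:

```python
def constraint_applies(credence_is_type_k: float, threshold: float = 0.9) -> bool:
    """Threshold view (illustrative): the absolute constraint against K-type
    actions is triggered if and only if our justified credence that a
    contemplated action is of type K meets the threshold."""
    return credence_is_type_k >= threshold

# Below the threshold the constraint is silent and ordinary weighing of
# outcomes may take over; at or above it, the action is absolutely forbidden.
print(constraint_applies(0.95))  # True  -> constraint in force
print(constraint_applies(0.40))  # False -> constraint not triggered
```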
In this stimulating introduction, David Woodruff Smith introduces the whole of Husserl’s thought, demonstrating his influence on philosophy of mind and language, on ontology and epistemology, and on philosophy of logic, mathematics and science. Starting with an overview of his life and works, and his place in twentieth-century philosophy, and in western philosophy as a whole, David Woodruff Smith introduces Husserl’s concept of phenomenology, explaining his influential theories of intentionality, objectivity and subjectivity. In subsequent chapters he covers Husserl’s logic, metaphysics, realism and transcendental idealism, and epistemology. Finally, he assesses the significance and implications of Husserl’s work for contemporary philosophy of mind and cognitive science. Including a timeline, glossary and extensive suggestions for further reading, Husserl is essential reading for anyone interested in this eminent philosopher, phenomenology or twentieth-century philosophy.
Combining the methods of the modern philosopher with those of the historian of ideas, Knud Haakonssen presents an interpretation of the philosophy of law which Adam Smith developed out of - and partly in response to - David Hume's theory of justice. While acknowledging that the influences on Smith were many and various, Dr Haakonssen suggests that the decisive philosophical one was Hume's analysis of justice in A Treatise of Human Nature and the second Enquiry. He therefore begins with a thorough investigation of Hume, from which he goes on to show the philosophical originality of Smith's new form of natural jurisprudence. At the same time, he provides an overall reading of Smith's social and political thought, demonstrating clearly the exact links between the moral theory of The Theory of Moral Sentiments, the Lectures on Jurisprudence, and the sociohistorical theory of The Wealth of Nations. This is the first full analysis of Adam Smith's jurisprudence; it emphasizes its normative and critical function, and relates this to the psychological, sociological, and historical aspects which hitherto have attracted most attention. Dr Haakonssen is critical of both purely descriptivist and utilitarian interpretations of Smith's moral and political philosophy, and demonstrates the implausibility of regarding Smith's view of history as pseudo-economic or 'materialist'.
REMARKS ON EVOLUTION AND TIME-SCALES, Graham Cairns-Smith; HODGSON'S BLACK BOX, Thomas Clark; DO HODGSON'S PROPOSITIONS UNIQUELY CHARACTERIZE FREE WILL?, Ravi Gomatam; WHAT SHOULD WE RETAIN FROM A PLAIN PERSON'S CONCEPT OF FREE WILL?, Gilberto Gomes; ISOLATING DISPARATE CHALLENGES TO HODGSON'S ACCOUNT OF FREE WILL, Liberty Jaswal; FREE AGENCY AND LAWS OF NATURE, Robert Kane; SCIENCE VERSUS REALIZATION OF VALUE, NOT DETERMINISM VERSUS CHOICE, Nicholas Maxwell; COMMENTS ON HODGSON, J.J.C. Smart; THE VIEW FROM WITHIN, Sean Spence; COMMENTARY ON HODGSON, Henry Stapp.
Following a hypnotic amnesia suggestion, highly hypnotically suggestible subjects may experience amnesia for events. Does this reflect a failure to retrieve the material concerned from autobiographical memory, or is the material retrieved but blocked from consciousness? Highly hypnotically suggestible subjects produced free associates to a list of concrete nouns. They were then given an amnesia suggestion for that episode, followed by another free-association list, which included 15 critical words that had been previously presented. If episodic retrieval for the first trial had been blocked, the responses on the second trial should still have been at least as fast as on the first trial; with semantic priming, they should have been faster. In fact, they were on average half a second slower. This suggests that the material had been retrieved but blocked from consciousness. A goal-oriented information-processing framework is outlined to interpret these and related data.
Norman Kemp Smith's The Philosophy of David Hume continues to be unsurpassed in its comprehensive coverage of the ideas and issues of Hume's Treatise. Now, after years of waiting, this currently out-of-print and highly sought-after classic is being reissued. This ground-breaking book has long been regarded as a classic study by scholars in the field, and a new introduction by Don Garrett places the book in its contemporary context, showing Hume's continuing importance.
The computational genomics community has come increasingly to rely on the methodology of annotating the scientific literature using terms from controlled structured vocabularies such as the Gene Ontology (GO). We here address the question of what such annotations signify and of how they are created by working biologists. Our goal is to promote a better understanding of how the results of experiments are captured in annotations, in the hope that this will lead to better representations of biological reality through both the annotation process and ontology development, and to more informed use of the GO resources by experimental scientists.
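For readers unfamiliar with what such an annotation actually contains, the sketch below models its core fields in Python, loosely following the GO Annotation File (GAF) format; the field selection is simplified and the example values are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class GoAnnotation:
    """Core fields of a GO annotation, loosely following the GAF format.
    Simplified and illustrative only."""
    db_object_id: str   # identifier of the gene product being annotated
    go_id: str          # term from the Gene Ontology
    evidence_code: str  # how the claim is supported, e.g. IDA, IEA, TAS
    reference: str      # the publication the annotation was derived from

example = GoAnnotation(
    db_object_id="UniProtKB:P12345",  # hypothetical gene product
    go_id="GO:0006915",               # "apoptotic process"
    evidence_code="IDA",              # Inferred from Direct Assay
    reference="PMID:0000000",         # placeholder literature reference
)
```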
Although living conditions have improved throughout history, protest, at least in the last few decades, seems to have increased to the point of becoming a normal phenomenon in modern societies. Contributors to this volume examine how and why this is the case and argue that although problems such as poverty, hunger, and violations of democratic rights may have been reduced in advanced Western societies, a variety of other problems and opportunities have emerged and multiplied the reasons and possibilities for protest.
There was a strong consensus in the commentaries that animals' performances in metacognition paradigms indicate high-level decisional processes that cannot be explained associatively. Our response summarizes this consensus and the support for the idea that these performances demonstrate animal metacognition. We amplify the idea that there is an adaptive advantage favoring animals who can – in an immediate moment of difficulty or uncertainty – construct a decisional assemblage that lets them find an appropriate behavioral solution. A working consciousness would serve this function well. This explains why animals may have the functional equivalent of human declarative consciousness. However, like other commentators who were friendly to this equivalence, we approach carefully the stronger claims that animals' metacognitive performances imply full-blown self-awareness or phenomenal consciousness. We discuss the commentators' interesting ideas for future research, as well as their intriguing ideas about the evolution and development of metacognition and its relation to theory of mind. We also discuss residual confusions about existing research and remaining methodological issues.
Biologists and philosophers differ on whether selection should be analyzed at the level of the gene or of the individual. In his book Darwinian Populations and Natural Selection, Peter Godfrey-Smith argues that individuals can be good members of Darwinian populations, whereas genes rarely can. I take issue with parts of this view, and suggest that Godfrey-Smith's scheme for thinking about Darwinian populations is also applicable to populations of genes.
The United Kingdom Climate Impacts Programme's UKCP09 project makes high-resolution projections of the climate out to 2100 by post-processing the outputs of a large-scale global climate model. The aim of this paper is to describe and analyse the methodology used and then urge some caution. Given the acknowledged systematic, shared errors of all current climate models, treating model outputs as decision-relevant projections can be significantly misleading. In extrapolatory situations, such as projections of future climate change, there is little reason to expect that post-processing of model outputs can correct for the consequences of such errors. This casts doubt on our ability, today, to make trustworthy probabilistic projections at high resolution out to the end of the century.
The United Kingdom Climate Impacts Programme's UKCP09 project makes high-resolution forecasts of climate during the 21st century using state-of-the-art global climate models. The aim of this paper is to introduce and analyze the methodology used, and then to urge some caution. Given the acknowledged systematic errors in all current climate models, treating model outputs as decision-relevant probabilistic forecasts can be seriously misleading. This casts doubt on our ability, today, to make trustworthy, high-resolution predictions out to the end of this century.
Pilot Valley, located in the eastern Basin and Range province of western Utah, USA, contains numerous shorelines and depositional remnants of Late Pleistocene Lake Bonneville. These remnants present excellent ground-penetrating radar (GPR) targets due to their coherent stratification and low clay, salinity, and moisture content. Three-dimensional GPR imaging can resolve the fine-scale stratigraphy of these deposits down to a few centimeters, and when combined with detailed outcrop characterization, it provides an in-depth look at the architecture of these deposits. On the western side of Pilot Valley, a well-preserved Late Pleistocene gravel bar records shoreline depositional processes associated with the Provo shoreline period. GPR data, measured stratigraphic sections, cores, paleontological sampling for paleoecology and radiocarbon dating, and mineralogical analysis permit a detailed reconstruction of the depositional environment of this well-exposed prograding gravel bar. Contrary to other described Bonneville shoreline deposits, calibrated radiocarbon ages ranging from 16.5 to 14.3 ka indicate that the bar was stable and active during an overall regressive stage of the lake, as it dropped from the Provo shoreline. Our study provides a model for an ancient pluvial lakeshore depositional environment in the Basin and Range province and suggests that the stable, progradational bedforms common to the various stages of Lake Bonneville are likely not all associated with periods of shoreline stability, as is commonly assumed. The high-resolution GPR visualization demonstrates the high degree of compartmentalization possible for a potential subsurface reservoir target based on ancient shoreline sedimentary facies.
We have synthesized a 582,970-base-pair Mycoplasma genitalium genome. This synthetic genome, named M. genitalium JCVI-1.0, contains all the genes of wild-type M. genitalium G37 except MG408, which was disrupted by an antibiotic marker to block pathogenicity and to allow for selection. To identify the genome as synthetic, we inserted "watermarks" at intergenic sites known to tolerate transposon insertions. Overlapping "cassettes" of 5 to 7 kilobases (kb), assembled from chemically synthesized oligonucleotides, were joined by in vitro recombination to produce intermediate assemblies of approximately 24 kb, 72 kb ("1/8 genome"), and 144 kb ("1/4 genome"), which were all cloned as bacterial artificial chromosomes in Escherichia coli. Most of these intermediate clones were sequenced, and clones of all four 1/4 genomes with the correct sequence were identified. The complete synthetic genome was assembled by transformation-associated recombination cloning in the yeast Saccharomyces cerevisiae, then isolated and sequenced. A clone with the correct sequence was identified. The methods described here will be generally useful for constructing large DNA molecules from chemically synthesized pieces and also from combinations of natural and synthetic DNA segments. doi:10.1126/science.1151721
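The hierarchical assembly arithmetic can be checked back-of-the-envelope; the short Python sketch below is our own illustration (the stage sizes reported in the abstract are rounded, and the actual cassette boundaries were not perfectly uniform):

```python
# Back-of-the-envelope check of the hierarchical assembly described above.
genome_bp = 582_970

for label, divisor in [("1/4 genome", 4), ("1/8 genome", 8)]:
    print(f"{label}: ~{genome_bp / divisor / 1000:.0f} kb")
# -> 1/4 genome: ~146 kb   (abstract reports ~144 kb)
# -> 1/8 genome: ~73 kb    (abstract reports ~72 kb)

# Rough number of 5-7 kb starting cassettes needed to tile the genome:
print(f"cassettes at ~6 kb each: ~{genome_bp / 6_000:.0f}")  # ~97
```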