Theories of spatial cognition are derived from many sources. Psychologists are concerned with determining the features of the mind which, in combination with external inputs, produce our spatialized experience. A review of philosophical and other approaches has convinced us that the brain must come equipped to impose a three-dimensional Euclidean framework on experience – our analysis suggests that object re-identification may require such a framework. We identify this absolute, nonegocentric, spatial framework with a specific neural system centered in the hippocampus. A consideration of the kinds of behaviours in which such a spatial mapping system would be important is followed by an analysis of the anatomy and physiology of this system, with special emphasis on the place-coded neurons recorded in the hippocampus of freely moving rats. A tentative physiological model for the hippocampal cognitive map is proposed. A review of lesion studies, in tasks as diverse as discrimination learning, avoidance, and extinction, shows that the cognitive map notion can adequately explain much of the data. The model is extended to humans by the assumption that spatial maps are built in one hemisphere, semantic maps in the other. The latter provide a semantic deep structure within which discourse comprehension and production can be achieved. Evidence from the study of amnesic patients, briefly reviewed, is consistent with this extension.
This Neurocomputing special issue is based on selected, expanded and significantly revised versions of papers presented at the Second International Conference on Brain Inspired Cognitive Systems (BICS 2006) held at Lesvos, Greece, from 10 to 14 October 2006. The aim of BICS 2006, which followed the very successful first BICS 2004 held at Stirling, Scotland, was to bring together leading scientists and engineers who use analytic, syntactic and computational methods both to understand the prodigious processing properties of biological systems and, specifically, of the brain, and to exploit such knowledge to advance computational methods towards ever higher levels of cognitive competence. The biennial BICS Conference Series (with BICS 2008 recently held in Sao Luis, Brazil, 24–27 June, and BICS 2010 due to be held in Madrid, Spain) aims to become a major point of contact for research scientists, engineers and practitioners throughout the world in the fields of cognitive and computational systems inspired by the brain and biology. The first paper in this special issue is by Carnell, who presents an analysis of the use of Hebbian and Anti-Hebbian spike time-dependent plasticity (STDP) learning functions within the context of recurrent spiking neural networks. He shows that under specific conditions Hebbian and Anti-Hebbian learning can be considered approximately equivalent. Finally, the author demonstrates that such a network habituates to a given stimulus and is capable of detecting subtle variations in the structure of the stimulus itself. Hodge, O'Keefe and Austin present a binary neural shape matcher using Johnson counters and chain codes. They show that images may be matched as whole images or using shape matching. Finally, they demonstrate shape matching using a binary associative-memory neural network to index and match chain codes where the chain code elements are represented by Johnson codes.
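The Hebbian and Anti-Hebbian STDP rules mentioned above can be illustrated with a minimal sketch of the standard pair-based exponential update; the function name, parameter names, and constants here are illustrative assumptions, not taken from Carnell's paper.

```python
import math

def stdp_delta_w(delta_t, a_plus=0.1, a_minus=0.12, tau=20.0, hebbian=True):
    """Weight change for a single pre/post spike pair.

    delta_t = t_post - t_pre in milliseconds. Under the Hebbian rule,
    pre-before-post (delta_t > 0) potentiates the synapse and
    post-before-pre (delta_t <= 0) depresses it; the Anti-Hebbian rule
    simply flips the sign of the update. All constants are illustrative.
    """
    if delta_t > 0:
        dw = a_plus * math.exp(-delta_t / tau)   # potentiation window
    else:
        dw = -a_minus * math.exp(delta_t / tau)  # depression window
    return dw if hebbian else -dw
```

On this sketch, the two rules are mirror images of one another, which is one intuitive way to see how, under suitable conditions on the network, their effects could turn out to be approximately equivalent.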
Most expressions in natural language are vague. But what is the best semantic treatment of terms like 'heap', 'red' and 'child'? And what is the logic of arguments involving this kind of vague expression? These questions are receiving increasing philosophical attention, and in this book, first published in 2000, Rosanna Keefe explores the questions of what we should want from an account of vagueness and how we should assess rival theories. Her discussion ranges widely and comprehensively over the main theories of vagueness and their supporting arguments, and she offers a powerful and original defence of a form of supervaluationism, a theory that requires almost no deviation from standard logic yet can accommodate the lack of sharp boundaries to vague predicates and deal with the paradoxes of vagueness in a methodologically satisfying way. Her study will be of particular interest to readers in philosophy of language and of mind, philosophical logic, epistemology and metaphysics.
This book is the one to put into the hands of those who have been over-impressed by Austin's critics.... [Warnock's] brilliant editing puts everybody who is concerned with philosophical problems in his debt.
The influence of J. L. Austin on contemporary philosophy was substantial during his lifetime, and has grown greatly since his death, at the height of his powers, in 1960. Philosophical Papers, first published in 1961, was the first of three volumes of Austin's work to be edited by J. O. Urmson and G. J. Warnock. Together with Sense and Sensibilia and How to do things with Words, it has extended Austin's influence far beyond the circle who knew him or read the handful of papers he published in journals.
This work has been selected by scholars as being culturally important and is part of the knowledge base of civilization as we know it. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity (individual or corporate) has a copyright on the body of the work. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. To ensure a quality reading experience, this work has been proofread and republished using a format that seamlessly blends the original graphical elements with text in an easy-to-read typeface. We appreciate your support of the preservation process, and thank you for being an important part of keeping this knowledge alive and relevant.
The Province of Jurisprudence Determined (1832) is a classic of nineteenth-century English jurisprudence, a subject on which Austin had a profound impact. His book is primarily concerned with a meticulous explanation of most of the core concepts of his legal philosophy, including his conception of law, his separation of law and morality, and his theory of sovereignty. Almost a quarter of it consists of an interpretation and defence of the principle of utility. This edition includes the complete and unabridged text of the fifth (1885) and last edition. The comprehensive introduction discusses Austin's life, the main themes of his book, leading criticisms of his ideas, and recent interpretations of his legal philosophy. The edition also includes an up-to-date bibliography and biographical synopses of the principal figures mentioned in the text.
This book offers a novel defence of a highly contested philosophical position: biological natural kind essentialism. This theory is routinely and explicitly rejected for its purported inability to be explicated in the context of contemporary biological science, and its supposed incompatibility with the process and progress of evolution by natural selection. Christopher J. Austin challenges these objections, and in conjunction with contemporary scientific advancements within the field of evolutionary-developmental biology, the book utilises a contemporary neo-Aristotelian metaphysics of "dispositional properties", or causal powers, to provide a theory of essentialism centred on the developmental architecture of organisms and its role in the evolutionary process. By defending a novel theory of Aristotelian biological natural kind essentialism, Essence in the Age of Evolution represents the fresh and exciting union of cutting-edge philosophical insight and scientific knowledge.
Law and the Humanities: An Introduction brings together a distinguished group of scholars from law schools and an array of the disciplines in the humanities. Contributors come from the United States and abroad in recognition of the global reach of this field. This book is, at one and the same time, a stock taking both of different national traditions and of the various modes and subjects of law and humanities scholarship. It is also an effort to chart future directions for the field. By reviewing and analyzing existing scholarship and providing thematic content and distinctive arguments, it offers to its readers both a resource and a provocation. Thus, Law and the Humanities marks the maturation of this 'law and' enterprise and will spur its further development.
_The Foundations of Arithmetic_ is undoubtedly the best introduction to Frege's thought; it is here that Frege expounds the central notions of his philosophy, subjecting the views of his predecessors and contemporaries to devastating analysis. The book represents the first philosophically sound discussion of the concept of number in Western civilization. It profoundly influenced developments in the philosophy of mathematics and in general ontology.
This sequel to the widely read Zen and the Brain continues James Austin's explorations into the key interrelationships between Zen Buddhism and brain research. In Zen-Brain Reflections, Austin, a clinical neurologist, researcher, and Zen practitioner, examines the evolving psychological processes and brain changes associated with the path of long-range meditative training. Austin draws not only on the latest neuroscience research and new neuroimaging studies but also on Zen literature and his personal experience with alternate states of consciousness. Zen-Brain Reflections takes up where the earlier book left off. It addresses such questions as: how do placebos and acupuncture change the brain? Can neuroimaging studies localize the sites where our notions of self arise? How can the latest brain imaging methods monitor meditators more effectively? How do long years of meditative training plus brief enlightened states produce pivotal transformations in the physiology of the brain? In many chapters testable hypotheses suggest ways to correlate normal brain functions and meditative training with the phenomena of extraordinary states of consciousness. After briefly introducing the topic of Zen and describing recent research into meditation, Austin reviews the latest studies on the amygdala, frontotemporal interactions, and paralimbic extensions of the limbic system. He then explores different states of consciousness, both the early superficial absorptions and the later, major "peak experiences." This discussion begins with the states called kensho and satori and includes a fresh analysis of their several different expressions of "oneness." He points beyond the still more advanced states toward that rare ongoing stage of enlightenment that is manifest as "sage wisdom." Finally, with reference to a delayed "moonlight" phase of kensho, Austin envisions novel links between migraines and metaphors, moonlight and mysticism.
The Zen perspective on the self and consciousness is an ancient one. Readers will discover how relevant Zen is to the neurosciences, and how each field can illuminate the other.
In Austin's Way with Skepticism, Mark Kaplan argues that J. L. Austin's 'ordinary language' approach to epistemological problems has been misread. Contrary to the consensus view, Kaplan presents Austin's methods as both a powerful critique of the project of constructive epistemology and an appreciation of how epistemology needs to be done.
First published in 1962, this work contains the William James Lectures delivered at Harvard University in 1955. It sets out Austin's conclusions in the field to which he directed his main efforts for at least the last ten years of his life. Starting from an exhaustive examination of his already well-known distinction of performative utterances from statements, Austin here finally abandons that distinction, replacing it by a more general theory of 'illocutionary forces' of utterances which has important bearings on a wide variety of philosophical problems.
The subject of this paper, Excuses, is one not to be treated, but only to be introduced, within such limits. It is, or might be, the name of a whole branch, even a ramiculated branch, of philosophy, or at least of one fashion of philosophy. I shall try, therefore, first to state what the subject is, why it is worth studying, and how it may be studied, all this at a regrettably lofty level: and then I shall illustrate, in more congenial but desultory detail, some of the methods to be used, together with their limitations, and some of the unexpected results to be expected and lessons to be learned. Much, of course, of the amusement, and of the instruction, comes in drawing the coverts of the microglot, in hounding down the minutiae, and to this I can do no more here than incite you. But I owe it to the subject to say, that it has long afforded me what philosophy is so often thought, and made, barren of -- the fun of discovery, the pleasures of co-operation, and the satisfaction of reaching agreement.
This work sets out Austin's conclusions in the field to which he directed his main efforts for at least the last ten years of his life. Starting from an exhaustive examination of his already well-known distinction between performative utterances and statements, Austin here finally abandons that distinction, replacing it with a more general theory of 'illocutionary forces' of utterances which has important bearings on a wide variety of philosophical problems.
According to dispositionalism, de re modality is grounded in the intrinsic natures of dispositional properties. Those properties are able to serve as the ground of de re modal truths, it is said, because they bear a special relation to counterfactual conditionals, one of truthmaking. However, because dispositionalism purports to ground de re modality only on the intrinsic natures of dispositional properties, it had better be the case that they do not play that truthmaking role merely in virtue of their being embedded in some particular, extrinsic causal context. This paper examines a recent argument against dispositionalism that purports to show that the intrinsicality of that relation cannot be maintained, due to the ceteris paribus nature of the counterfactuals that dispositions make-true. When two prominent responses are examined, both are found wanting: at best, they require unjustified special pleading, and at worst, they amount to little more than ad hoc conceptual trickery.
Sport builds character. If this is true, why is there a consistent stream of news detailing the bad behavior of athletes? We are bombarded with accounts of elite athletes using banned performance-enhancing substances, putting individual glory ahead of the excellence of the team, engaging in disrespectful and even violent behavior towards opponents, and seeking victory above all else. We are also given a steady diet of more salacious stories that include various embarrassing, immoral, and illegal behaviors in the private lives of elite athletes. Elite sport is not alone in this; youth sport has its own set of moral problems. Parents assault officials, undermine coaches, encourage a win-at-all-costs mentality, and in many cases ruin sport for their children.
The article appraises Habermas's recent writings on theology and social theory and their relevance to a new sociology of religion in the `post-secular society'. Beginning with Kant's Religion Within the Limits of Reason Alone, Habermas revisits his earlier thesis of the `linguistification of the sacred', arguing for a `rescuing translation' of the traditional contents of religious language through pursuit of a via media between an overconfident project of modernizing secularization, on the one hand, and a fundamentalism of religious orthodoxies, on the other. Several questions, however, must be raised about this current project. How far can Habermas engage adequately with religious ideas of the absolute while still retaining certain broadly functionalist theoretical premises? Is the notion of an ongoing secularization process in the `post-secular society' a contradiction in terms? What appropriate `limits and boundaries' are to be accepted between the domains of knowledge and faith, and how strictly can they be drawn? How coherent is the notion of `methodological atheism', and how consistently can Habermas pursue the project of a `religious genealogy of reason'?
In identifying intrinsic molecular chance and extrinsic adaptive pressures as the only causally relevant factors in the process of evolution, the theoretical perspective of the Modern Synthesis had a major impact on the perceived tenability of an ontology of dispositional properties. However, since the late 1970s, an increasing number of evolutionary biologists have challenged the descriptive and explanatory adequacy of this "chance alone, extrinsic only" understanding of evolutionary change. Because morphological studies of homology, convergence, and teratology have revealed a space of possible forms and phylogenetic trajectories that is considerably more restricted than expected, evo-devo has focused on the causal contribution of intrinsic developmental processes to the course of evolution. Evo-devo's investigation into the developmental structure of the modality of morphology – including both the possibility and impossibility of organismal form – has led to the utilisation of a number of dispositional concepts that emphasise the tendency of the evolutionary process to change along certain routes. In this sense, and in contrast to the perspective of the Modern Synthesis, evo-devo can be described as a "science of dispositions." This chapter discusses the recent philosophical literature on dispositional properties in evo-devo, exploring debates about both the metaphysical and epistemological aspects of the central dispositional concepts utilised in contemporary evo-devo and addressing the epistemological question of how dispositional properties challenge existing explanatory models in evolutionary biology.
This volume contains selections from the philosophical writings of James Frederick Ferrier. Ferrier was the Professor of Moral Philosophy at the University of St Andrews between 1845 and 1864 and he was one of the earliest post-Hegelian British idealists. He develops a system of absolute idealism via a rejection of the Scottish school of common sense and Enlightenment philosophy in general. These selections focus on his primary philosophical interests: epistemology and ontology. Ferrier denies the possibility of a science of man and suggests that philosophy should focus on self-consciousness, the defining feature of humanity. Jennifer Keefe is assistant professor of philosophy at the University of Wisconsin-Parkside.
James Frederick Ferrier was a mid-nineteenth-century Scottish metaphysician who developed the first post-Hegelian system of idealism in Britain. Unlike the British Idealists in the latter half of the nineteenth century, he was neither a Kantian nor a Hegelian. Instead, he largely develops his idealist metaphysics via his defense of Berkeley and …
The great Falsification Debate about the logical status of religious beliefs seems fairly quiescent at present. Most philosophers of religion have opted for one or the other of two opposite responses to the falsificationists' challenge.
This is the first book to address philosophically the moral and political underpinnings of terrorism and anti-terrorism. It brings together authors with differing attitudes and original perspectives on the ethical and practical justifications for terrorism.
Vagueness is currently the subject of vigorous debate in the philosophy of logic and language. Vague terms -- such as 'tall', 'red', 'bald', and 'tadpole' -- have borderline cases; and they lack well-defined extensions. The phenomenon of vagueness poses a fundamental challenge to classical logic and semantics, which assume that propositions are either true or false and that extensions are determinate. This anthology collects for the first time the most important papers in the field. After a substantial introduction that surveys the field, the essays form four groups, starting with some historically notable pieces. The 1970s saw an explosion of interest in vagueness, and the second group of essays reprints classic papers from this period. The following group of papers represents the best recent work on the logic and semantics of vagueness. The essays in the final group are contributions to the continuing debate about vague objects and vague identity.
Austin’s Sense and Sensibilia (1962) generates wildly different reactions among philosophers. Interpreting Austin on perception starts with a reading of this text, and this in turn requires reading into the lectures key ideas from Austin’s work on natural language and the theory of knowledge. The lectures paint a methodological agenda, and a sketch of some first-order philosophy, done the way Austin thinks it should be done. Crucially, Austin calls for philosophers to bring a deeper understanding of natural language meaning to bear as they do their tasks. In consequence Austin’s lectures provide a fascinating start—but only a start—on a number of key questions in the philosophy of perception.
In modern jurisprudence it is taken as axiomatic that John Austin's sanction-based account of law and legal obligation was demolished in H.L.A. Hart's The Concept of Law, but Hart's victory and the deficiencies of the Austinian account may not be so clear. Not only does the alleged linguistic distinction between being obliged and having an obligation fail to provide as much support for the idea of a sanction-independent legal obligation as is commonly thought, but the soundness of Hart's claims, as well as the claims of many legal theorists who have followed him, depend on a contested view of the nature of legal theory. If the task of a theory of law, as Joseph Raz and others have influentially argued, is to identify the essential features of the concept of law, then the theoretical possibility, if not the empirical reality, of a sanction-free legal system is what is most important. But if the task of a theory of law is to provide philosophical and theoretical illumination of law as it exists and as it is experienced, then a theory of law that fails to give a central place to law's coercive reality may for that reason be deficient as a theory of law. The question of the soundness of the Austinian account, therefore, may be a function of the answer to the question of what a theory of law is designed to accomplish.
From the mid-nineteenth century to the early twentieth century British Idealism was a leading school of philosophical thought and the Scottish Idealists made important contributions to this philosophical school. In Scotland, there were two types of post-Hegelian idealism: Absolute Idealism and Personal Idealism. This article will show the ways in which these philosophical systems arose by focusing on their leading representatives: Edward Caird and Andrew Seth Pringle-Pattison.