Population-level biomedical research offers new opportunities to improve population health, but also raises new challenges to traditional systems of research governance and ethical oversight. Partly in response to these challenges, various models of public involvement in research are being introduced. Yet, the ways in which public involvement should meet governance challenges are not well understood. We conducted a qualitative study with 36 experts and stakeholders using the World Café method to identify key governance challenges and explore how public involvement can meet these challenges. This brief report discusses four cross-cutting themes from the study: the need to move beyond individual consent; issues in benefit and data sharing; the challenge of delineating and understanding publics; and the goal of clarifying justifications for public involvement. The report aims to provide a starting point for making sense of the relationship between public involvement and the governance of population-level biomedical research, showing connections, potential solutions and issues arising at their intersection. We suggest that, in population-level biomedical research, there is a pressing need for a shift away from conventional governance frameworks focused on the individual and towards a focus on collectives, as well as to foreground ethical issues around social justice and develop ways to address cultural diversity, value pluralism and competing stakeholder interests. There are many unresolved questions around how this shift could be realised, but these questions should form the basis for developing justificatory accounts and frameworks for suitable collective models of public involvement in population-level biomedical research governance. No data are available.
Arguing About Language presents a comprehensive selection of key readings on fundamental issues in the philosophy of language. It offers a fresh and exciting introduction to the subject, addressing both perennial problems and emerging topics. Classic readings from Frege, Russell, Kripke, Chomsky, Quine, Grice, Lewis and Davidson appear alongside more recent pieces by philosophers and linguists such as Robyn Carston, Delia Graff Fara, Frank Jackson, Ernie Lepore & Jerry Fodor, Nathan Salmon, Zoltán Szabó, Timothy Williamson and Crispin Wright. Organised into clear sections, the readings have been chosen to engage with one another, often taking opposing views on the same question, helping students to get to grips with the key areas of debate in the philosophy of language, including: sense and reference; definite descriptions; linguistic conventions; language and behaviour; descriptivism and rigidity; contextualism; vagueness; rule-following and normativity; and fictional discourse. Each article selected is clear, thought-provoking and free from unnecessary jargon. The editors provide lucid introductions to each section in which they give an overview of the debate and outline the arguments of the papers. Arguing About Language is an ideal reader for students looking for a balanced yet up-to-date introduction to the philosophy of language. Darragh Byrne is lecturer in philosophy at the University of Birmingham, UK. Max Kölbel is ICREA Research Professor at the University of Barcelona, Spain. He is the author of Truth without Objectivity (Routledge, 2002) and co-editor of Wittgenstein's Lasting Significance (Routledge, 2004) with Bernhard Weiss, as well as Relative Truth (Oxford, 2008) with Manuel García-Carpintero.
The simple knowledge norm of assertion (SKNA) holds that one may assert that p only if one knows that p. Turri (2011, pp. 37–45) and Williamson both argue that more is required for epistemically permissible assertion. In particular, they both think that the asserter must assert on the basis of her knowledge. Turri calls this the express knowledge norm of assertion (EKNA). I defend SKNA and argue against EKNA. First, I argue that EKNA faces counterexamples. Second, I argue that EKNA assumes an implausible view of permissibility on which an assertion is epistemically permissible only if it is made for a right reason, i.e., a reason that contributes to making it the case that it is epistemically permissible to make that assertion. However, the analogous view in other normative domains is both controversial and implausible, because it leaves no room for one to act or react rightly for the wrong reason. I suggest that proponents of EKNA have conflated requirements for φ-ing rightly with requirements for φ-ing well. Finally, I argue that proponents of SKNA can explain the intuitive defectiveness of asserting on the basis of an epistemically bad reason, even when the asserters know the content of their assertion, by arguing that the asserters are epistemically blameworthy.
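To fix ideas, the two norms can be schematized roughly as follows. The symbolization is offered only as an informal gloss on the abstract, not as Turri's or Williamson's own formulation; read $K_S\,p$ as "S knows that p".

% Informal gloss of the two norms (not the authors' notation)
\[
\textbf{(SKNA)}\quad \text{S may assert that } p \;\rightarrow\; K_S\, p
\]
\[
\textbf{(EKNA)}\quad \text{S may assert that } p \;\rightarrow\; \bigl(K_S\, p \;\wedge\; \text{S's assertion that } p \text{ is based on } K_S\, p\bigr)
\]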
‘Intuition deniers’ are those who—like Timothy Williamson, Max Deutsch, Herman Cappelen and a few others—reject the claim that philosophers centrally rely on intuitions as evidence. This ‘Centrality’ hypothesis, as Cappelen terms it, is standardly endorsed both by traditionalists and by experimental philosophers. Yet the intuition deniers claim that Centrality is false—and they generally also suggest that this undermines the significance of experimental philosophy. Three primary types of anti-Centrality argument have cross-cut the literature thus far. These arguments, I’ll claim, have differing potential consequences for metaphilosophical debate. The first sort of argument centers on worries about the term ‘intuition’—for instance, worries about whether it has clear application, or whether anything actually falls under it. Call this the Argument from Unclear Application. The second argument type involves the claim that evidence in philosophy consists not of facts about intuitions, but of facts about e.g. knowledge and causation. Call this the Argument from Antipsychologism. The third type involves an attempt to demonstrate that philosophers support their claims not via bald appeal to intuition, but via argumentation. Call this the Argument from Argumentation. Although these three arguments have merit, none of them undermines the importance of experimental philosophy. Nonetheless, they do have significant consequences for the methodological debates that dominate metaphilosophy, and for experimental philosophy in particular.
This paper argues for two interrelated claims. The first is that the most innovative contribution of Timothy Williamson, Herman Cappelen, and Max Deutsch to the debate about the epistemology of thought experiments is not the denial of intuition and the claim of the irrelevance of experimental philosophy, but the claim of epistemological continuity and the rejection of philosophical exceptionalism. The second is that a better way of implementing the claim of epistemological continuity is neither Deutsch and Cappelen’s argument view nor Williamson’s folk psychological view. This is so because, while the argument view makes the basis of the relevant classificational judgement evidentially too demanding, the folk psychological view makes it too weak and error-prone to count as an adequate explanation. Drawing from a certain reading of Aristotle’s Nicomachean Ethics that flowers in Miranda Fricker and John McDowell, I argue for the reason-responsiveness view. Like the extant views, the reason-responsiveness view vindicates the claim of epistemological continuity. But unlike the extant views, it does not share those problematic features. Further, I show that the reason-responsiveness view offers a way for champions of the claim of epistemological continuity to resist Avner Baz’s objection to the claim of epistemological continuity and his objection to the philosophical use of thought experiments, while taking on board some attractive elements of his view.
There has been very little discussion of the appropriate principles to govern a modal logic of plurals. What debate there has been has accepted a principle I call (Necinc): informally, if this is one of those then, necessarily, this is one of those. On this basis Williamson has criticised the Boolosian plural interpretation of monadic second-order logic. I argue against (Necinc), noting that it isn't a theorem of any logic resulting from adding modal axioms to the plural logic PFO+, and showing that the most obvious formal argument in its favour is question-begging. I go on to discuss the behaviour of natural language plurals, motivating a case against (Necinc) by developing a case that natural language plural terms are not de jure rigid designators. The paper concludes by developing a model theory for modal PFO-f which does not validate (Necinc). An Appendix discusses (Necinc) in relation to counterpart theory. 'Of course, it would be a mistake to think that the rules for "multiple pointing" follow automatically from the rules for pointing proper' (Max Black, 'The Elusiveness of Sets'). In some influential articles during the 1980s George Boolos proposed an interpretation of monadic second-order logic in terms of plural quantification [4, 5]. One objection to this proposal, pressed by Williamson [22, 456-7], focuses on the modal behaviour of plural variables, arguing that the proposed interpretation yields the wrong results in respect of the modal status of atomic predications. In the present paper I will present this objection and argue against it. In the course of developing the argument, I will have cause to consider the under-investigated question of how a logic for plurals should be extended to incorporate modal operators.
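For readers who want the principle at issue in symbols: writing "x ≺ xx" for "x is one of the xs", (Necinc) is commonly rendered along the following lines (a standard symbolization given for orientation, not necessarily the paper's own notation):

% Necessity of plural inclusion, schematically
\[
\textbf{(Necinc)}\qquad \forall x\,\forall xx\,\bigl(x \prec xx \;\rightarrow\; \Box\,(x \prec xx)\bigr)
\]

Read: whatever is one of some things is necessarily one of them, the plural analogue of the necessity of set membership.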
Unconditionals are syntactic conditionals whose affirmation affirms their consequent, unconditionally. Prominent instances were addressed by J.L. Austin ('There are biscuits if you want some') and Nelson Goodman (even-if 'semifactuals'). Their detailed features are explained in a Decision-Theoretic Semantics (DTS) which extends, by certainty and relevance conditions, the "CCCP" conditional probability construal of conditionals due to Ernest Adams and others. The construal of assertions of conditionals as conditional acts, defended by Keith DeRose and Richard Grandy in 1999 against objections arising from Austin's unconditionals, is shown to be incompatible with any known version of CCCP. However, an ad hoc decision-theoretic construal of their proposal is seen to work for Austin's unconditionals, albeit for these only. The DTS account of unconditionals hinges on stochastic independence or irrelevance conditions. For Austin's variant, it also involves changes in the expectations of non-indicator random variables, i.e. properly valuational relevance relations. The widely assumed speaker's knowledge presumption for assertion in general is reconstrued non-transcendently as a proposal to update to ostensible common certainty. This imperative replaces the factive 'T-axiom', KA → A, of modal-logical explications of knowledge, to yield a doxastically imperatival but evidence-sensitive theory of assertoric truth claims. On such a basis, Max Black's and G.E. Moore's ostensibilist account of assertors representing themselves as knowing is defended against Tim Williamson's 1996 and Igor Douven's 2006 respective deontic proposals for a knowledge or rational credibility requirement. Moreover, ostensibly unconditional, unhedged assertion is reconciled with a reality of dimly perceived potential defeaters by recognizing it to be a tacit semifactual, if need be.
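For background, the "CCCP" referred to here is the familiar thesis, associated with Adams, that the probability (or assertability) of a simple indicative conditional goes by the corresponding conditional probability. In its simplest form (stated as orientation, not as this paper's own extended formulation):

% Conditional probability construal of conditionals, simplest form
\[
\Pr(\text{if } A \text{ then } B) \;=\; \Pr(B \mid A) \;=\; \frac{\Pr(A \wedge B)}{\Pr(A)}, \qquad \text{provided } \Pr(A) > 0.
\]

The paper's DTS then adds certainty and relevance (stochastic independence) conditions on top of this core construal.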
Knowledge and its Limits presents a systematic new conception of knowledge as a kind of mental state sensitive to the knower's environment. It makes a major contribution to the debate between externalist and internalist philosophies of mind, and breaks radically with the epistemological tradition of analyzing knowledge in terms of true belief. The theory casts new light on such philosophical problems as scepticism, evidence, probability and assertion, realism and anti-realism, and the limits of what can be known. The arguments are illustrated by rigorous models based on epistemic logic and probability theory. The result is a new way of doing epistemology and a notable contribution to the philosophy of mind.
Oxford Studies in Metaphysics is the forum for the best new work in this flourishing field. Much of the most interesting work in philosophy today is metaphysical in character: this new series is a much-needed focus for it. OSM offers a broad view of the subject, featuring not only the traditionally central topics such as existence, identity, modality, time, and causation, but also the rich clusters of metaphysical questions in neighbouring fields, such as philosophy of mind and philosophy of science. Besides independent essays, volumes will often contain a critical essay on a recent book, or a symposium that allows participants to respond to one another's criticisms and questions. Anyone who wants to know what's happening in metaphysics can start here. Volume Two begins with a major paper on consciousness by Ned Block. Block examines 'Max Black's Objection to Mind-Body Identity', an argument for a dualism of physical and phenomenal properties, closely related to Jackson's 'knowledge argument'. His extensive exploration of this family of arguments for property dualism includes considerable discussion of John Perry and Stephen White; their responses to Block's paper complete the section on the metaphysics of consciousness. Three papers consider the thesis that the future is, in some sense, 'open'. Eli Hirsch elaborates a view according to which contingent statements about the future can be indeterminate in truth-value, while preserving 'straight logic', including a principle of bivalence. Peter Forrest defends a sort of 'growing block' theory of the passage of time, emphasizing the way such a metaphysics, combined with a truth-maker principle, can provide an analysis of natural necessity. Trenton Merricks presents a trenchant and original criticism of the 'growing block' theory of time. The volume continues with a group of papers on problems of ontology. Thomas Hofweber's paper, defending nominalism from the objection that there are 'inexpressible' properties and propositions, won the first annual Oxford Studies in Metaphysics Younger Scholar Prize. The papers by Phillip Bricker and Michael Loux examine a couple of deep divides within ontology. John Hawthorne's paper raises some extremely puzzling questions about the nature of persons, given the ontology needed for Timothy Williamson's theory of vagueness. Hawthorne uses these problems to motivate an alternative style of epistemicism. The final three papers take up several issues in the metaphysics of traditional theism. Michael Bergmann and Jeffrey Brower raise objections to combining a Platonic conception of universals with the doctrine of divine aseity, while Brian Leftow defends a non-Platonic theory of universals - a kind of divine-concept nominalism. Hud Hudson suggests that contemplation of the possibility of higher dimensions opens up new avenues in theodicy.
The second volume in the Blackwell Brown Lectures in Philosophy, this volume offers an original and provocative take on the nature and methodology of philosophy, based on public lectures at Brown University given by the pre-eminent philosopher Timothy Williamson. It rejects the ideology of the 'linguistic turn', the most distinctive trend of 20th-century philosophy; explains the method of philosophy as a development from non-philosophical ways of thinking; and suggests new ways of understanding what contemporary and past philosophers are doing.
What does 'if' mean? Timothy Williamson presents a controversial new approach to understanding conditional thinking, which is central to human cognitive life. He argues that in using 'if' we rely on psychological heuristics, fast and frugal methods which can lead us to trust faulty data and prematurely reject simple theories.
This unique volume gathers Weber's writings on a broad array of themes, from the nature of work, to the political culture of democracy, to the uniqueness of the West, to the character of the family and race relations, to the role of science and the fate of ethical action in the modern world. It presents Weber's writings in a comprehensive collection organized by topic; rejuvenates a central, pivotal theme of Weberian thought: "How do we live?" and "How can we live in the industrial society?"; and connects Weber's writings to contemporary issues through modern essays and editorial introductions.
Published posthumously in the early 1920s, Max Weber's Economy and Society has since become recognized as one of the greatest sociological treatises of the 20th century, as well as a foundational text of the modern sociological imagination. The first strictly empirical comparison of social structures and normative orders conducted in world-historical depth, this two-volume set of Economy and Society—now with new introductory material contextualizing Weber's work for 21st-century audiences—looks at social action, religion, law, bureaucracy, charisma, the city, and the political community. Meant as a broad introduction for an educated general public, Economy and Society is in its own way the most demanding textbook yet written by a sociologist. The precision of its definitions, the complexity of its typologies, and the wealth of its historical content make the work an important challenge to our sociological thought: for the advanced undergraduate who gropes for her sense of society, for the graduate student who must develop his own analytical skills, and for the scholar who must match wits with Weber.
(Publisher's Description) In the World Library of Psychologists series, international experts themselves present career-long collections of what they judge to be their finest pieces - extracts from books, key articles, salient research findings, and their major practical and theoretical contributions. In this volume Max Velmans reflects on his long and varied career, considers its highs and lows in a brand new introduction, and offers reactions to those who have responded to his published work over the years. This book offers a unique and compelling collection of the best publications in consciousness studies from one of the few psychologists to treat the topic systematically and seriously. Velmans' approach is multi-faceted and represents a convergence of numerous fields of study, culminating in fascinating insights that are of interest to philosopher, psychologist and neuroscientist alike. With continuing contemporary relevance and significant historical impact, this collection of works is an essential resource for all those engaged or interested in the field of consciousness studies and the philosophy of mind.
Luther, A. R. The articulated unity of being in Scheler's phenomenology: basic drive and spirit.--Funk, R. L. Thought, values, and action.--Emad, P. Person, death, and world.--Smith, F. J. Peace and pacifism.--Scheler, M. Metaphysics and art.--Scheler, M. The meaning of suffering.
Stephen Houlgate is one of the leading Hegel scholars of the English-speaking world. In this interview he explains how he became a “Hegelian” while studying in Cambridge, and he offers a fundamental profile of his account of Hegel. The interview addresses the following questions: Why does Houlgate consider Hegel’s philosophy to be the “consummate critical philosophy”? What are the main barriers to proper access to Hegel’s thought? Why is logic as dialectical logic still indispensable for philosophical thought? And finally, what can both analytical and “continental” philosophers learn from Hegel?
Writing from a scientifically and philosophically informed perspective, the authors provide a critical overview of the conceptual difficulties encountered in many current neuroscientific and psychological theories.
Is philosophy a unique discipline, or are its methods more like those of other sciences than many philosophers think? Timothy Williamson explains clearly and concisely how contemporary philosophers think and work, and reflects on their powers and limitations.
Through a curated selection of essays written over four decades by one of Australia’s leading philosophers, this collection demonstrates the impact of Continental philosophy on philosophical thought in Australia.
Writing in 1912, before the Bolshevik Revolution, American socialist John Spargo said that it was “inconceivable” that a democratic socialist society would ever abolish the “sacred right” of freedom of publication which had been won at so great a sacrifice. According to Spargo, “every Socialist writer of note” agreed with Karl Kautsky that the freedom of the press, and of literary production in general, is an “essential condition” of democratic socialism.
This book is a defense of the methods of analytic philosophy against a recent empirical challenge to the soundness of those methods. The challenge is raised by practitioners of “experimental philosophy” and concerns the extent to which analytic philosophy relies on intuition—in particular, the extent to which analytic philosophers treat intuitions as evidence in arguing for philosophical conclusions. Experimental philosophers say that analytic philosophers place a great deal of evidential weight on people’s intuitions about hypothetical cases and thought experiments. This book argues that this view of traditional philosophical method is a myth, part of “metaphilosophical folklore.” Analytic philosophy makes regular use of hypothetical examples and thought experiments, but philosophers argue for their claims about what is true or not true in these examples and thought experiments. It is these arguments, not intuitions, that are treated as evidence for the claims. The book discusses xphi and some recent xphi studies; critiques a variety of other metaphilosophical claims; examines such famous arguments as Gettier’s refutation of the JTB theory and Kripke’s Gödel Case argument against descriptivism about proper names, and shows that they rely on reasoning rather than intuition; and finds existing critiques of xphi, the “Multiple Concepts” and “Expertise” replies, to be severely lacking.
Dialectic of Enlightenment is undoubtedly the most influential publication of the Frankfurt School of Critical Theory. Written during the Second World War and circulated privately, it appeared in a printed edition in Amsterdam in 1947. "What we had set out to do," the authors write in the Preface, "was nothing less than to explain why humanity, instead of entering a truly human state, is sinking into a new kind of barbarism." Yet the work goes far beyond a mere critique of contemporary events. Historically remote developments, indeed, the birth of Western history and of subjectivity itself out of the struggle against natural forces, as represented in myths, are connected in a wide arch to the most threatening experiences of the present. The book consists of five chapters, at first glance unconnected, together with a number of shorter notes. The various analyses concern such phenomena as the detachment of science from practical life, formalized morality, the manipulative nature of entertainment culture, and a paranoid behavioral structure, expressed in aggressive anti-Semitism, that marks the limits of enlightenment. The authors perceive a common element in these phenomena, the tendency toward self-destruction of the guiding criteria inherent in enlightenment thought from the beginning. Using historical analyses to elucidate the present, they show, against the background of a prehistory of subjectivity, why the National Socialist terror was not an aberration of modern history but was rooted deeply in the fundamental characteristics of Western civilization. Adorno and Horkheimer see the self-destruction of Western reason as grounded in a historical and fateful dialectic between the domination of external nature and society. They trace enlightenment, which split these spheres apart, back to its mythical roots. Enlightenment and myth, therefore, are not irreconcilable opposites, but dialectically mediated qualities of both real and intellectual life. "Myth is already enlightenment, and enlightenment reverts to mythology." This paradox is the fundamental thesis of the book. This new translation, based on the text in the complete edition of the works of Max Horkheimer, contains textual variants, commentary upon them, and an editorial discussion of the position of this work in the development of Critical Theory.
Inductive logic is a theory of how one should reason in the face of uncertainty. It has applications to decision making and artificial intelligence, as well as to scientific problems.
Timothy Williamson is one of the most influential living philosophers working in the areas of logic and metaphysics. His work in these areas has been particularly influential in shaping debates about metaphysical modality, which is the topic of his recent provocative and closely-argued book Modal Logic as Metaphysics (2013). The present book comprises ten essays by metaphysicians and logicians responding to Williamson's work on metaphysical modality. The authors include some of the most distinguished philosophers of modality in the world, as well as several rising stars. Each essay is followed by a reply by Williamson. In addition, the book contains a major new essay by Williamson, 'Modal science', concerning the role of modal claims in natural science. This book was originally published as a special issue of the Canadian Journal of Philosophy.
Author Max Black argues that language should conform to the discovered regularities of experience, and that the conception of language as a mirror of reality is radically mistaken.
This special issue of the Canadian Journal of Philosophy is dedicated to Timothy Williamson's work on modality. It consists of a new paper by Williamson followed by papers on Williamson's work on modality, with each followed by a reply by Williamson. Contributors: Andrew Bacon, Kit Fine, Peter Fritz, Jeremy Goodman, John Hawthorne, Øystein Linnebo, Ted Sider, Robert Stalnaker, Meghan Sullivan, Gabriel Uzquiano, Barbara Vetter, Timothy Williamson, Juhani Yli-Vakkuri.
A counterpossible conditional is a counterfactual with an impossible antecedent. Common sense delivers the view that some such conditionals are true, and some are false. In recent publications, Timothy Williamson has defended the view that all are true. In this paper we defend the common sense view against Williamson’s objections.
Originally published in 1973, this book shows that methods developed for the semantics of systems of formal logic can be successfully applied to problems about the semantics of natural languages; and, moreover, that such methods can take account of features of natural language which have often been thought incapable of formal treatment, such as vagueness, context dependence and metaphorical meaning. Parts 1 and 2 set out a class of formal languages and their semantics. Parts 3 and 4 show that these formal languages are rich enough to be used in the precise description of natural languages. Appendices describe some of the concepts discussed in the text.
Timothy Williamson has fruitfully exploited formal resources to shed considerable light on the nature of knowledge. In the paper under examination, Williamson turns his attention to Gettier cases, showing how they can be motivated formally. At the same time, he disparages the kind of justification he thinks gives rise to these cases. He favors instead his own notion of justification, for which Gettier cases cannot arise. We take issue both with his disparagement of the kind of justification that figures in Gettier cases and with the specifics of the formal motivation.
Explores, at different levels, the social emotions of fellow-feeling, the sense of identity, love and hatred, and traces their relationship to one another and to the values with which they are associated. This book reviews the evaluations of love and sympathy in different historical periods and in different social and religious environments.
Eighteen leading philosophers offer critical assessments of Timothy Williamson's ground-breaking work on knowledge and its impact on philosophy today. They discuss epistemological issues concerning evidence, defeasibility, scepticism, testimony, assertion, and perception, and debate Williamson's central claim that knowledge is a mental state.
This essay criticizes Williamson’s attempt, in his book The Philosophy of Philosophy, to undermine the interest of the a priori–a posteriori distinction. Williamson’s argument turns on several large claims. The first is that experience often plays a role intermediate between evidential and merely enabling, and that this poses a difficulty for giving a theoretically satisfying account of the distinction. The second is that there are no constitutive understanding–assent links. Both of these claims are subjected to detailed scrutiny. In particular, it is argued that Williamson’s case of the deviant logician, Simon, fails to constitute an intelligible counterexample to the status of conjunction elimination as an understanding–assent link for ‘and’.