Taking a schizoanalytic approach to audio-visual images, this article explores some of the radical potentia for deterritorialisation found within David Fincher's Fight Club (1999). The film's potential for deterritorialisation is initially located in an exploration of the film's form and content, which appear designed to interrogate and transcend a series of false binaries between mind and body, inside and outside, male and female. Paying attention to the construction of photorealistic digital spaces and composited images, we examine the actual (and possible) ways viewers relate to the film, both during and after screenings. Recognising the film as an affective force performing within our world, we also investigate some of the real-world effects the film catalysed. Finally, we propose that schizoanalysis, when applied to a Hollywood film, suggests that Deleuze underestimated the deterritorialising potential of contemporary, special effects-driven cinema. If schizoanalysis has thus been reterritorialised by mainstream products, we argue that new, ‘post-Deleuzian’ lines of flight are required to disrupt this ‘de-re-territorialisation’.
In several works on modality, G. H. von Wright presents tree structures to explain possible worlds. Worlds that might have developed from an earlier world are possible relative to it. Actually possible worlds are possible relative to the world as it actually was at some point. Many logically consistent worlds are not actually possible. Transitions from node to node in a tree structure are probabilistic. Probabilities are often more useful than similarities between worlds in treating counterfactual conditionals.
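A toy model can make the tree picture concrete. The sketch below is my illustration, not von Wright's own formalism: nodes are world-stages, edges carry transition probabilities, worlds "actually possible" at a stage are those reachable from it, and a counterfactual is assessed by the probability of its consequent among the antecedent branches.

```python
# A toy branching-worlds tree (an illustration, not von Wright's own
# formalism). Nodes are world-stages; edges carry transition probabilities.
# "Had A been the case, C would have been" is assessed by the probability
# of C among the fully developed worlds in which A holds.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    facts: set = field(default_factory=set)
    children: list = field(default_factory=list)   # list of (prob, Node)

def prob_given(stage, antecedent, consequent):
    """P(consequent | antecedent) over fully developed worlds below stage."""
    p_a = p_ac = 0.0
    def walk(node, p):
        nonlocal p_a, p_ac
        if not node.children:                       # leaf = developed world
            if antecedent in node.facts:
                p_a += p
                p_ac += p if consequent in node.facts else 0.0
        for q, child in node.children:
            walk(child, p * q)
    walk(stage, 1.0)
    return p_ac / p_a if p_a else None

# From stage w0 a match is struck (0.3) or not (0.7); if struck, it
# lights with probability 0.9.
lit  = Node("w_lit", {"struck", "lit"})
dud  = Node("w_dud", {"struck"})
idle = Node("w_idle")
w1   = Node("w1", {"struck"}, [(0.9, lit), (0.1, dud)])
w0   = Node("w0", set(), [(0.3, w1), (0.7, idle)])

print(prob_given(w0, "struck", "lit"))   # 0.9: had it been struck, it would very probably have lit
```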
At least since Descartes, philosophers have been interested in the special knowledge or authority that we exhibit when we speak about our own thoughts, attitudes, and feelings. This book contends that even the best work in contemporary philosophy of mind fails to account for this sort of knowledge or authority because it does not pay the right sort of attention to the notion of expression. What's at stake is not only how to understand self-knowledge and first-person authority, but also what it is that distinguishes conscious from unconscious psychological states, what the mental life of a nonlinguistic animal has in common with our sort of mental life, and how to think about Wittgenstein's legacy to the philosophy of mind.
In the form of inference known as inference to the best explanation, there are various ways to characterise what is meant by the best explanation. This paper considers a number of such characterisations, including several based on confirmation measures and several based on coherence measures. The goal is to find a measure which adequately captures what is meant by 'best' and which also yields the truth with a high degree of probability. Computer simulations are used to show that the overlap coherence measure achieves this goal, enabling the true explanation to be identified almost as often as an approach which simply selects the most probable explanation. Further advantages to this approach are also considered in the case where there is uncertainty in the prior probability distribution.
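The following sketch shows the kind of simulation described; the specific setup is my assumption, not the paper's actual experimental design. Hypotheses are mutually exclusive and exhaustive, the overlap coherence of H with evidence E is taken to be P(H ∧ E)/P(H ∨ E), and selection by maximal overlap is compared with selection by maximal posterior probability.

```python
# A simulation sketch (the setup is an assumption, not the paper's actual
# design). Hypotheses H_1..H_n are mutually exclusive and exhaustive;
# "overlap coherence" of H with evidence E is taken to be
# P(H & E) / P(H v E). We compare how often selecting by maximal overlap
# vs. by maximal posterior P(H | E) identifies the true hypothesis.

import random

def trial(n=4):
    priors = [random.random() for _ in range(n)]
    total = sum(priors)
    priors = [p / total for p in priors]
    likes = [random.random() for _ in range(n)]          # P(E | H_i)
    true_h = random.choices(range(n), weights=priors)[0]
    if random.random() >= likes[true_h]:
        return None                                      # E not observed
    p_e = sum(p * l for p, l in zip(priors, likes))      # P(E)
    post = [p * l / p_e for p, l in zip(priors, likes)]  # P(H_i | E)
    over = [p * l / (p + p_e - p * l)                    # P(H&E)/P(HvE)
            for p, l in zip(priors, likes)]
    return (max(range(n), key=post.__getitem__) == true_h,
            max(range(n), key=over.__getitem__) == true_h)

results = [r for r in (trial() for _ in range(20000)) if r]
print("max posterior accuracy:", sum(a for a, _ in results) / len(results))
print("max overlap accuracy:  ", sum(b for _, b in results) / len(results))
```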
For a stable visual world, the colours of objects should appear the same under different lights. This property of colour constancy has been assumed to be fundamental to vision, and many experimental attempts have been made to quantify it. I contend here, however, that the usual methods of measurement are either too coarse or concentrate not on colour constancy itself, but on other, complementary aspects of scene perception. Whether colour constancy exists other than in nominal terms remains unclear.
This paper provides a new approach to inference to the best explanation based on a new coherence measure for comparing how well hypotheses explain the evidence. It addresses a number of criticisms by Clark Glymour of the use of probabilistic measures in this context, including limitations of earlier work on IBE. Computer experiments are used to show that the new approach finds the truth with a high degree of accuracy in hypothesis selection tasks and that in some cases its accuracy is greater than hypothesis selection based on maximizing posterior probability. Hence, by overcoming some of the problems with the previous approach, this work provides a more adequate defence of IBE and suggests that IBE not only tracks truth but also has practical advantages over the previous approach. Applications of the new approach to parameter estimation and model selection are also explored.
Two of the probabilistic measures of coherence discussed in this paper take probabilistic dependence into account and so depend on prior probabilities in a fundamental way. An example is given which suggests that this prior-dependence can lead to potential problems. Another coherence measure is shown to be independent of prior probabilities in a clearly defined sense and consequently is able to avoid such problems. The issue of prior-dependence is linked to the fact that the first two measures can be understood as measures of coherence as striking agreement, while the third measure represents coherence as agreement. Thus, prior (in)dependence can be used to distinguish different conceptions of coherence.
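For concreteness, two well-known measures fit this description, though the abstract does not name its measures, so the pairing below is my assumption: Shogenji's dependence measure, which depends directly on the priors, and the overlap measure, which can be rewritten purely in terms of conditional probabilities, one clear sense in which it is prior-independent.

```latex
% Assumed candidates, not confirmed by the abstract. Shogenji's measure
% depends directly on the priors P(A), P(B); the overlap measure reduces
% to conditional probabilities alone.
\[
  C_{S}(A,B) = \frac{P(A \wedge B)}{P(A)\,P(B)}, \qquad
  C_{O}(A,B) = \frac{P(A \wedge B)}{P(A \vee B)}
             = \frac{1}{\frac{1}{P(A \mid B)} + \frac{1}{P(B \mid A)} - 1}.
\]
```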
This paper presents a new argument for the likelihood ratio measure of confirmation by showing that one of the adequacy criteria used in another argument can be replaced by a more plausible and better supported criterion which is a special case of the weak likelihood principle. This new argument is also used to show that the likelihood ratio measure is to be preferred to a measure that has recently received support in the literature.
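For reference, the likelihood ratio measure of confirmation is standardly stated as follows, often in logarithmic form; the rival measure the paper argues against is not specified in the abstract, so it is not reproduced here.

```latex
% The likelihood ratio measure, often given in log form.
\[
  l(H,E) = \frac{P(E \mid H)}{P(E \mid \neg H)},
  \qquad\text{or}\qquad
  \log\frac{P(E \mid H)}{P(E \mid \neg H)}.
\]
```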
Naive mereology studies ordinary, common-sense beliefs about part and whole. Some of the speculations in this article on naive mereology do not bear directly on Peter van Inwagen's "Material Beings". The other topics, (1) and (2), both do. (1) Here is an example of Peter Unger's "Problem of the Many". How can a table be a collection of atoms when many collections of atoms have equally strong claims to be that table? Van Inwagen invokes fuzzy sets to solve this problem. I claim that an alternative treatment of vagueness, supervaluations over many-value valuations, provides a better solution. (2) The Special Composition Question asks how parts compose a whole. One who rejects van Inwagen's answer in terms of constituting a life need not provide some alternative answer. Even if all answers to the Special Question fail, there are a multitude of less general composition questions that are not so difficult.
A primary purpose of argument is to increase the degree of reasonable confidence that one has in the truth of the conclusion. A question begging argument fails this purpose because it violates what W. E. Johnson called an epistemic condition of inference. Although an argument of the sort characterized by Robert Hoffman in his response (Analysis 32.2, Dec 71) to Richard Robinson (Analysis 31.4, March 71) begs the question in all circumstances, we usually understand the charge that an argument is question begging with reference to the beliefs of the person, or the sort of person, to whom the argument is directed.
Self-consciousness and the self -- Diachronic unity, diachronic singularity, and the subject of consciousness -- A modal argument for immateriality -- Intelligibility concerns and causal objections -- Concluding remarks.
Pragmatism is a distinctive approach to clinical research ethics that can guide bioethicists and members of institutional review boards (IRBs) as they struggle to balance the competing values of promoting medical research and protecting human subjects participating in it. After defining our understanding of pragmatism in the setting of clinical research ethics, we show how a pragmatic approach can provide guidance not only for the day-to-day functioning of the IRB, but also for evaluation of policy standards, such as the one that addresses acceptable risks for healthy children in clinical research trials. We also show how pragmatic considerations might influence the debate about the use of deception in clinical research. Finally, we show how a pragmatic approach, by regarding the promotion of human research and the protection of human subjects as equally important values, helps to break down the false dichotomy between science and ethics in clinical research.
Everything red is colored, and all squares are polygons. A square is distinguished from other polygons by being four-sided, equilateral, and equiangular. What distinguishes red things from other colored things? This has been understood as a conceptual rather than scientific question. Theories of wavelengths and reflectance and sensory processing are not considered. Given just our ordinary understanding of color, it seems that what differentiates red from other colors is only redness itself. The Cambridge logician W. E. Johnson introduced the terms determinate and determinable to apply to examples such as red and colored. Chapter XI of Johnson's Logic, Part I (1921), “The Determinate and the Determinable,” is the main text for discussion of this distinction.
A dualistic, discarnate picture haunts contemporary cognitive science of religion. Cognitive scientists of religion generally assert or assume a reductive physicalism, explaining religion primarily through unconscious mental mechanisms that detect supernatural agency where none exists and a larger purpose to life where there is none. Accompanying this focus is a downplaying of conscious reflection in religious belief and practice. Yet the mind side of dualism enters into CSR in interesting ways. Some cognitive scientists turn practitioners of religion into dualists who allegedly believe in disembodied spirits. By emphasizing supernatural agency, CSR neglects nonpersonal powers and meanings in religion, both in terms of magical thinking and practice and of nonpersonal conceptions of divinity. Additionally, some cognitive scientists of religion declare that all humans are innate dualists. They use this alleged dualism to explain beliefs about both an afterlife and transfers of consciousness. Finally, some call on this dualism to serve a salvific function, trying to salvage some meaning to human life.
“Every interpretation hangs in the air, together with what it interprets, and cannot serve as its support. Interpretations by themselves do not determine meaning” (Wittgenstein, Philosophical Investigations § 198). Introduction. The work of the American philosopher David H. Finkelstein, Expression and the Inner, originally published in 2003 by Harvard University Press (2nd ed. 2008), can now be read in the Spanish version by Lino San Juan, published by the Oviedo-based KRK Ediciones under the title La expr.
I criticize and emend J. L. Mackie's account of causal priority by replacing 'fixity' in its central clause with 'x is a causal condition of y, but y is not a causal condition of x'. This replacement works only if 'is a causal condition of' is not a symmetric relation. Even apart from our desire to account for causal priority, it is desirable to have an account of nonsymmetric conditionship. Truth, for example, is a condition of knowledge, but knowledge is not a condition of truth. My definitions of 'sufficient condition for' and 'necessary condition for' do not imply that p is a sufficient condition of q if and only if q is a necessary condition of p.
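For contrast, the standard truth-functional reduction makes that equivalence automatic; this gloss is mine, not the paper's, and the paper's definitions are designed to block it.

```latex
% On the material-conditional reading, both notions reduce to the same
% formula, forcing the equivalence the paper denies:
\[
  \text{$p$ is sufficient for $q$} \;\equiv\; (p \supset q) \;\equiv\;
  \text{$q$ is necessary for $p$}.
\]
% A nonsymmetric conditionship relation blocks this reduction: truth is a
% condition of knowledge, but not conversely.
```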
We present a conceptual framework on the experience of time and provide a coherent basis for further inquiries into qualitative approaches to time experience. We propose two Time-Layers and two Time-Formats forming four Time-Domains. Micro-Flow and Micro-Structure represent the implicit phenomenal basis, from which the explicit experiences of Macro-Flow and Macro-Structure emerge. Complementary to this theoretical proposal, we present empirical results from qualitative content analysis obtained from 25 healthy participants. The data essentially corroborate the theoretical proposal. With respect to Flow, the phenomenally accessible time experience appeared as a continuous passage reaching from the past through the present into the future. With respect to Structure, the individual present was embedded in the individual biography, emerging from past experiences and comprising individual plans and goals. New or changing plans and goals were integrated into the existing present, thus forming a new present. The future appeared as changeable within the present, by means from the past, and therefore as a space of potential opportunities. As an example, we discuss these results in relation to previous empirical findings on deviant experiences of time in Autism Spectrum Disorder, which is presumably characterized by a breakdown of Flow and concomitant compensatory repetition resulting in an overly structured time. Finally, we speculate about possible implications of these findings for both psychopathological and neuroscientific research.
Locke thought it was a necessary truth that no two material bodies could be in the same place at the same time. Leibniz wasn't so sure. This paper sides with Leibniz. I examine the arguments of David Wiggins in defense of Locke on this point (Philosophical Review, January 1968). Wiggins’ arguments are ineffective.
A semantics of vagueness should reject the principle that every statement has a truth-value yet retain the classical tautologies. A many-value, non-truth-functional semantics and a semantics of super-valuations each have this result. According to the super-valuation approach, 'if a man with n hairs on his head is bald, then a man with n plus one hairs on his head is also bald' is false because it comes out false no matter how the vague predicate 'is bald' is appropriately made precise. But why should a sentence whose components actually remain imprecise be regarded as actually false just because it would be false if its components were precise? On one of the alternative treatments of quantification allowed by the many-value approach, the sentence in question is assigned an intermediate value closer to 'false' than to 'true'. Despite the elegance of the super-valuation approach, there are reasons to prefer the many-value approach.
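The contrast can be made explicit as follows; this is my gloss, not the paper's own (non-truth-functional) semantics.

```latex
% Supervaluations: every admissible sharpening of `bald' fixes a cutoff k
% with B(k) true and B(k+1) false, so the quantified sorites premise
\[
  \forall n\,\bigl(B(n) \rightarrow B(n+1)\bigr)
\]
% comes out false on every sharpening (superfalse). A degree-valued
% treatment can instead give each instance a value just short of true,
% e.g. via the (truth-functional) Lukasiewicz rule
\[
  v(p \rightarrow q) = \min\bigl(1,\; 1 - v(p) + v(q)\bigr),
\]
% when v(B(n)) falls gradually; the value of the quantified premise then
% turns on the chosen treatment of the universal quantifier.
```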
While the challenges of public deliberation on emerging technologies are crucial to keep in mind, this paper argues that scholars and practitioners have reason to be more confident in their performance of participatory technology assessments (pTA). Drawing on evidence from the 2008 National Citizens’ Technology Forum (NCTF) conducted by the Center for Nanotechnology in Society at Arizona State University, this paper describes how pTA offers a combination of intensive and extensive qualities that is unique among modes of engagement. In the NCTF, this combination led to significant learning and opinion changes, based on what can be characterized as high-quality deliberation. The quality of the anticipatory knowledge required to address emerging technologies is always contested, but pTAs can be designed with outcomes in mind, especially when learning is understood as an outcome.
In this paper we explore challenges facing leadership in a culture of “all consuming images” from a perspective which claims that images have a moral or normative dimension. The cumulative effect of contemporary image saturation is increased resistance to the normative power of an image. We suggest that in a culturally diverse global economy, it is necessary to expand the moral aspects of good business leadership beyond providing a basis for productive, coherent group identity within a firm at the expense of seeing outsiders as “others.” We then explore what imagining leadership in business might be like in a world in which visual images shape our understandings of individual and group identity. While our focus is on leadership in business, we also draw examples from the political arena. Finally, we suggest that imagining business leadership in the ways we propose may be helpful to women, providing them with an image of business leadership more closely reflective of their experience of corporate culture, its limits, and possibilities.
Richard Dawkins has a dilemma when it comes to design arguments. On the one hand, he maintains that it was Darwin who killed off design and so implies that his rejection of design depends upon the findings of modern science. On the other hand, he follows Hume when he claims that appealing to a designer does not explain anything and so implies that rejection of design need not be based on the findings of modern science. These contrasting approaches lead to the following dilemma: if he claims that Darwinism is necessary for rejecting design, he has no satisfactory response to design arguments based on the order in the laws of physics or the fine-tuning of the physical constants; alternatively, if Humean arguments are doing most of the work, this would undermine one of his main contentions, that atheism is justified by science and especially by evolution. In any case, his Humean arguments do not provide a more secure basis for his atheism because they are seriously flawed. A particular problem is that his argument for the improbability of theism rests on a highly questionable application of probability theory since, even if it were sound, it would only establish that the prior probability of God’s existence is low, a conclusion which is compatible with the posterior probability of God’s existence being high.
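Bayes' theorem makes the closing point explicit: a low prior P(G) for God's existence is compatible with a high posterior P(G | E) given suitably strong evidence E. The illustrative numbers below are mine, not the paper's.

```latex
% Bayes' theorem; a low prior is compatible with a high posterior.
\[
  P(G \mid E) = \frac{P(E \mid G)\,P(G)}
                     {P(E \mid G)\,P(G) + P(E \mid \neg G)\,P(\neg G)}.
\]
% E.g. P(G) = 0.01, P(E|G) = 0.5, P(E|\neg G) = 0.001 gives
% P(G|E) = 0.005 / (0.005 + 0.00099) \approx 0.83.
```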