This article discusses some common narratives found in discourses on national identity in Russia and Japan, and their transformations over time reflecting the needs of a nation as it becomes a colonial empire. National identity discourse is examined from the viewpoint of national antagonism arising from an external threat. Russian and Japanese intellectuals, with their vastly different historical and cultural heritages, have dwelled upon similar issues pertaining to the modernization of the state and the adoption or rejection of foreign ideas and ways of life. There are several themes in Russian and Japanese discourses on national identity that share a significant overlap, particularly themes of national uniqueness and a ‘special path’, deterministic worldviews, imperial cosmopolitanism/messianism, and criticism of ‘Western’ philosophical systems and concepts. This article elucidates the shared aspects of these narratives and philosophical inquiries in Russia and Japan and puts them into a historical context.
Moral encroachment holds that the epistemic justification of a belief can be affected by moral factors. If the belief might wrong a person or group, more evidence is required to justify the belief. Moral encroachment thereby opposes evidentialism, and kindred views, which hold that epistemic justification is determined solely by factors pertaining to evidence and truth. In this essay I explain how beliefs such as ‘that woman is probably an administrative assistant’—based on the evidence that most women employees at the firm are administrative assistants—motivate moral encroachment. I then describe weaknesses of moral encroachment. Finally I explain how we can countenance the moral properties of such beliefs without endorsing moral encroachment, and I argue that the moral status of such beliefs cannot be evaluated independently from the understanding in which they are embedded.
In order to perform certain actions – such as incarcerating a person or revoking parental rights – the state must establish certain facts to a particular standard of proof. These standards – such as preponderance of evidence and beyond reasonable doubt – are often interpreted as likelihoods or epistemic confidences. Many theorists construe them numerically; beyond reasonable doubt, for example, is often construed as 90 to 95% confidence in the guilt of the defendant.

A family of influential cases suggests standards of proof should not be interpreted numerically. These ‘proof paradoxes’ illustrate that purely statistical evidence can warrant high credence in a disputed fact without satisfying the relevant legal standard. In this essay I evaluate three influential attempts to explain why merely statistical evidence cannot satisfy legal standards.
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence. In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternative accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards. I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
The epistemology of risk examines how risks bear on epistemic properties. A common framework for examining the epistemology of risk holds that strength of evidential support is best modelled as numerical probability given the available evidence. In this essay I develop and motivate a rival ‘relevant alternatives’ framework for theorising about the epistemology of risk. I describe three loci for thinking about the epistemology of risk. The first locus concerns consequences of relying on a belief for action, where those consequences are significant if the belief is false. The second locus concerns whether beliefs themselves—regardless of action—can be risky, costly, or harmful. The third locus concerns epistemic risks we confront as social epistemic agents. I aim to motivate the relevant alternatives framework as a fruitful approach to the epistemology of risk. I first articulate a ‘relevant alternatives’ model of the relationship between stakes, evidence, and action. I then employ the relevant alternatives framework to undermine the motivation for moral encroachment. Finally, I argue the relevant alternatives framework illuminates epistemic phenomena such as gaslighting, conspiracy theories, and crying wolf, and I draw on the framework to diagnose the undue skepticism endemic to rape accusations.
This paper argues that the practical implementation of blockchain technology can be considered an institution of property similar to legal institutions. Invoking Penner's theory of property and Hegel's system of property rights, and using the example of bitcoin, it is possible to demonstrate that blockchain effectively implements all necessary and sufficient criteria for property without reliance on legal means. Blockchains eliminate the need for a third-party authority to enforce exclusion rights, and provide a system of universal access to knowledge and discoverability about the property rights of all participants and how the system functions. The implications of these findings are that traditional property relations in society could be replaced by or supplemented with blockchain models, and implemented in new domains.
This chapter argues that the choice of trust conceptualisations in the context of the consumer Internet of Things (IoT) can have a significant impact on the understanding and implementations of a user’s private data protection. Narrow instrumental interpretations of trust as a mere precondition for technology acceptance may obscure important moral issues such as the malleability of users’ privacy decisions, and power imbalances between suppliers and consumers of technology. A shift of focus in policy proposals from trust to the trustworthiness of technology can be the first step on the way to addressing these moral concerns. It is argued that the complexity of IoT systems, comprised of technological artefacts and institutional data-collecting entities, warrants the moral value of distrust as a prima facie assumption for technological design and regulatory measures. Such a conceptual perspective highlights the importance of technological measures that can minimise reliance on trust in consumer IoT, and regulatory measures aimed to improve the transparency of IoT architectures.
This paper looks at the development of blockchain technologies that promise to bring new tools for the management of private data, providing enhanced security and privacy to individuals. Of particular interest are solutions aimed at reorganizing data flows in Internet of Things architectures, enabling the secure and decentralized exchange of data between network participants. However, as this paper argues, the promised benefits are counterbalanced by a significant shift towards the propertization of private data that underlies these proposals. Considering the unique capacity of blockchain technology applications to imitate and even replace traditional institutions, this aspect may present certain challenges, both technical and ethical in character. In order to highlight these challenges and associated concerns, this paper identifies the underlying techno-economic factors and normative assumptions defining the development of these solutions amounting to technologically enabled propertization. It is argued that without careful consideration of their wider impact, such blockchain applications could have effects opposite to the intended ones, thus contributing to the erosion of privacy for IoT users.
Self-sovereign identity solutions implemented on the basis of blockchain technology are seen as alternatives to existing digital identification systems, or even as a foundation of standards for the new global infrastructures for identity management systems. It is argued that ‘self-sovereignty’ in this context can be understood as the concept of individual control over identity-relevant private data, the capacity to choose where such data is stored, and the ability to provide it to those who need to validate it. It is also argued that while it might be appealing to operationalise the concept of ‘self-sovereignty’ in a narrow technical sense, depreciation of moral semantics obscures key challenges and long-term repercussions. Closer attention to the normative substance of the ‘sovereignty’ concept helps to highlight a range of ethical issues pertaining to the changing nature of human identity in the context of ubiquitous private data collection.
The teleological approach to an epistemic concept investigates it by asking questions such as ‘What is the purpose of the concept?’, ‘What role has it played in the past?’, or ‘If we imagine a society without the concept, why would they feel the need to invent it?’ The idea behind the teleological approach is that examining the function of the concept illuminates the contours of the concept itself. This approach is a relatively new development in epistemology, and as yet there are few works examining it. This paper aims to fill this gap and engender further understanding of the teleological method. I first contrast the teleological method with more orthodox approaches in epistemology. I then draw a three-way taxonomy of different kinds of teleological approach and provide an example of each kind. The teleological approach is often presented as antithetical to the more orthodox approaches in epistemology, and so in competition with them. I demur. I argue that the methods can be fruitfully combined in epistemological theorising; in the final section I suggest specific ways the teleological approach can be incorporated alongside more orthodox methods in a general methodological reflective equilibrium.
This essay is an accessible introduction to the proof paradox in legal epistemology.

In 1902 the Supreme Judicial Court of Maine filed an influential legal verdict. The judge claimed that in order to find a defendant culpable, the plaintiff “must adduce evidence other than a majority of chances”. The judge thereby claimed that bare statistical evidence does not suffice for legal proof.

In this essay I first motivate the claim that bare statistical evidence does not suffice for legal proof. I then introduce and motivate a knowledge-centred explanation of this fact. The knowledge-centred explanation rests on two premises. The first is that legal proof requires knowledge of culpability. The second is that one cannot attain knowledge that p from bare statistical evidence that p. To motivate the second premise, I suggest that beliefs based on bare statistical evidence fail to be safe—they could easily be wrong—and bare statistical evidence cannot eliminate relevant alternatives.

I then cast doubt on the first premise; I argue that legal proof does not require knowledge. I thereby dispute the knowledge-centred explanation of the inadequacy of bare statistical evidence for legal proof. Instead of appealing to the nature of knowledge, I suggest we should seek a more direct explanation by appealing to those more foundational epistemic properties, such as safety or eliminating relevant alternatives.
The moral significance of blockchain technologies is a highly debated and polarised topic, ranging from accusations that cryptocurrencies are tools serving only nefarious purposes such as cybercrime and money laundering, to the assessment of blockchain technology as an enabler for revolutionary positive social transformations of all kinds. Such technological determinism, however, hardly provides insights of sufficient depth on the moral significance of blockchain technology. This thesis argues, rather, that very much like the cryptographic tools before them, blockchains develop in a constant feedback loop. Blockchain applications are driven by values, normative assumptions, and personal commitments of researchers, which shape the moral effects of technology. At the same time these very assumptions are often embedded in preexisting moral conceptions and ethical theories, implicitly or explicitly accepted by blockchain developers. And just as the introduction of one flawed element in a cryptographic application can have mass-scale effects, the introduction of flawed normative assumptions can have far-reaching consequences in blockchain applications. This thesis argues that we should not take normative assumptions present in blockchain applications as given. Just as open-source code is developed through public revision and scrutiny, we should aim to make normative assumptions transparent and be ready to revise them if we find bugs. How can we qualify claims that blockchain technologies enable new types of institutions? Can blockchain technologies eliminate trust in complex socio-technical systems? What does individual sovereignty mean in the context of private data control and privacy? Can property in private data, enabled by blockchain applications, solve the moral issues of privacy and commercial surveillance?
Answers to these and other questions map some of the key normative assumptions present in current blockchain applications and serve as a contribution to the open-source project of the future society built on fundamental human values.
The aim of this paper is to determine whether and to what extent Woodward’s interventionist theory of causation is variable relative. In an influential review, Strevens has accused Woodward’s account of a damaging form of variable relativity, according to which obviously false causal claims can be made true by choosing a depleted variable set. Following McCain, I show that Strevens’ objection doesn’t succeed. However, Woodward also wants to avoid another kind of variable relativity, according to which it can be true that X is a cause of Y in one set of background conditions, but false in another. I show that Woodward’s account is problematically overpermissive, unless there are restrictions on the values that certain variables can take. I formulate a modified account that makes these restrictions explicit, then use it to argue that Woodward’s attempt to avoid relativity to background conditions is misguided. On the best interpretation of the interventionist theory, causal claims are assessed relative to a particular kind of variable set. Thus, I conclude that the theory should be understood as variable relative, in a specific, unproblematic sense.
Demonstratives and Indexicals. In the philosophy of language, an indexical is any expression whose content varies from one context of use to another. The standard list of indexicals includes pronouns such as “I”, “you”, “he”, “she”, “it”, “this”, “that”, plus adverbs such as “now”, “then”, “today”, “yesterday”, “here”, and “actually”. Other candidates include the tenses …
I apply James Woodward’s interventionist theory of causation to organic chemistry, modelling three different ways that chemists are able to manipulate the reaction conditions in order to control the outcome of a reaction. These consist in manipulations to the reaction kinetics, thermodynamics, and whether the kinetics or thermodynamics predominates. It is possible to construct interventionist causal models of all of these kinds of manipulation, and therefore to account for them using Woodward’s theory. However, I show that there is an alternative, more illuminating way of thinking about the third kind of reaction control, according to which chemists are thought of as manipulating which causal system is instantiated. I show that our ability to manipulate which system is instantiated is an important part of our ability to control the world, and is therefore especially relevant to interventionism. Thus, considering examples from organic chemistry leads to the identification of an important extension to Woodward’s theory. Finally, this investigation into reaction control in organic chemistry also has a more general implication: it suggests that interventionism results in a version of pragmatism about causation.
In criminal trials the state must establish, to a particular standard of proof, the defendant's guilt. The most widely used and important standard of proof for criminal conviction is the ‘beyond a reasonable doubt’ standard. But what legitimates this standard, rather than an alternative? One view holds the standard of proof should be determined or justified – at least in large part – by its consequences. In this spirit, Laudan uses crime statistics to estimate risks the average citizen runs of being violently victimised and falsely convicted. He argues that since the former risk is higher, and the aggregate harms are worse, the standard of proof should be substantially lowered. He presents a formula for calculating the preferred standard. In this article I outline various ways Laudan's uses of crime statistics are flawed, and explain how he substantially overestimates risks of victimhood and underestimates costs of false convictions. I also explain why his formula is mistaken, and illuminate consequences Laudan neglects. I conclude that, even if consequences determine the appropriate standard of proof, Laudan's arguments fail to show the standard is too high. I conclude by suggesting that the inadequacies of Laudan's reasoning might be good news for consequence-based justifications of the standard of proof.
Understanding enjoys a special kind of value, one not held by lesser epistemic states such as knowledge and true belief. I explain the value of understanding via a seemingly unrelated topic, the implausibility of veritism. Veritism holds that true belief is the sole ultimate epistemic good and all other epistemic goods derive their value from the epistemic value of true belief. Veritism entails that if you have a true belief that p, you have all the epistemic good qua p. Veritism is a plausible and widely held view; I argue that it is untenable. I argue that integration among beliefs possesses epistemic value independent from the good of true belief, and so has value veritism cannot account for. I argue further that this integration among beliefs comprises the distinctive epistemic value of understanding.
This essay introduces the ‘she said, he said’ paradox for Title IX investigations. ‘She said, he said’ cases are accusations of rape, followed by denials, with no further significant case-specific evidence available to the evaluator. In such cases, usually the accusation is true. Title IX investigations adjudicate sexual misconduct accusations in US educational institutions; I address whether they should be governed by the ‘preponderance of the evidence’ standard of proof or the higher ‘clear and convincing evidence’ standard.

Orthodoxy holds that the ‘preponderance’ standard is satisfied if the evidence adduced renders the litigated claim more likely than not. On this view, I argue, ‘she said, he said’ cases satisfy the ‘preponderance’ standard. But this consequence conflicts with plausible liberal and feminist claims. In this essay I contrast the ‘she said, he said’ paradox with legal epistemology’s proof paradox. I explain how both paradoxes arise from the distinction between individualised and non-individualised evidence, and I critically evaluate responses to the ‘she said, he said’ paradox.
Recently, there has been a large amount of support for the idea that causal claims can be sensitive to normative considerations. Previous work has focused on the concept of actual causation, defending the claim that whether or not some token event c is a cause of another token event e is influenced by both statistical and prescriptive norms. I focus on the policy debate surrounding alternative energies, and use the causal modelling framework to show that in this context, people’s normative commitments don’t just affect the causal claims they are willing to endorse, but also their understanding of the causal structure. In the context of the alternative energy debate, normative considerations affect our understanding of the causal structure by influencing our judgements about which variables should be held fixed, and therefore which variables should be relegated to the background of a causal model. In cases of extreme disagreement, normative commitments can also affect which causal structure we think should be instantiated. Thus, focusing on a new context has revealed a previously unexplored sense in which normative factors are incorporated into causal reasoning.
My essay ‘Attunement: On the Cognitive Virtues of Attention’ is the lead essay in a symposium. Adam Carter and Sandy Goldberg each respond to the ‘Attunement’ essay. This is my rejoinder.

(i.) Carter argues that resources from virtue reliabilism can explain the source of attention normativity. He modifies this virtue reliabilist AAA-framework to apply to attentional normativity. I raise concerns about Carter’s project. I suggest that true belief and proper attentional habits are not relevantly similar.

(ii.) Goldberg claims that social roles underwrite kinds of attentional normativity that are not well-captured by virtue theory. I critically assess this claim.
Some expressions of English, like the demonstratives ‘this’ and ‘that’, are referentially promiscuous: distinct free occurrences of them in the same sentence can differ in content relative to the same context. One lesson of referentially promiscuous expressions is that basic logical properties like validity and logical truth obtain or fail to obtain only relative to a context. This approach to logic can be developed in just as rigorous a manner as David Kaplan’s classic logic of demonstratives. The result is a logic that applies to arguments in English containing multiple occurrences of referentially promiscuous expressions.
Conciliatory views of peer disagreement hold that when an agent encounters peer disagreement she should conciliate by adjusting her doxastic attitude towards that of her peer. In this paper I distinguish different ways conciliation can be understood and argue that the way conciliationism is typically understood violates the principle of commutativity of evidence. Commutativity of evidence holds that the order in which evidence is acquired should not influence what it is reasonable to believe based on that evidence. I argue that when an agent encounters more than one peer, and applies the process of conciliation serially, the order in which she encounters the peers influences the resulting credence. I argue this is a problem for conciliatory views of disagreement, and suggest some responses available to advocates of conciliation.
How to reconcile the theory of evolution with existing religious beliefs has occupied minds since Darwin's time. The majority of the discourse on the subject is still focused on the Darwinian version of evolutionary theory, or at best, the mid-twentieth century version of the Modern Synthesis. However, evolutionary thought has moved forward since then, with the insights provided by the advent of comparative genomics in recent decades having a particularly significant impact. A theology that successfully incorporates evolutionary biology needs to take such developments into account, because the range of truly viable options among the many versions of theistic evolution that have been proposed in the past may narrow when this is done. Here I present these previously underappreciated strains of contemporary evolutionary thought and discuss their potential theological impact.
An account of the nature of knowledge must explain the value of knowledge. I argue that modal conditions, such as safety and sensitivity, do not confer value on a belief and so any account of knowledge that posits a modal condition as a fundamental constituent cannot vindicate widely held claims about the value of knowledge. I explain the implications of this for epistemology: We must either eschew modal conditions as a fundamental constituent of knowledge, or retain the modal conditions but concede that knowledge is not more valuable than that which falls short of knowledge. This second horn—concluding that knowledge has no distinctive value—is unappealing since it renders puzzling why so much epistemological theorising focuses on knowledge, and why knowledge seems so important.
In an earlier defense of the view that the fundamental logical properties of logical truth and logical consequence obtain or fail to obtain only relative to contexts, I focused on a variation of Kaplan’s own modal logic of indexicals. In this paper, I state a semantics and sketch a system of proof for a first-order logic of demonstratives, and sketch proofs of soundness and completeness. (I omit details for readability.) That these results obtain for the first-order logic of demonstratives shows that the significance of demonstratives for logic exceeds their behavior as rigid designators in counterfactual reasoning, or reasoning about alternative possibilities. Furthermore, the results in this paper help address one common objection to the view that logical truth and consequence obtain only relative to contexts. According to this objection, the view entails that logical consequence is not formal.
Thought experiments as counterexamples are a familiar tool in philosophy. Frequently, understanding a vignette seems to generate a challenge to a target theory. In this paper I explore the content of the judgement that we have in response to these vignettes. I first introduce several competing proposals for the content of our judgement, and explain why they are inadequate. I then advance an alternative view. I argue that when we hear vignettes we consider the normal instances of the vignette. If the normal instance of the vignette exhibits a counter-instance, the vignette constitutes a challenge to the target theory. I argue that this proposal shows how responses to vignettes are an ordinary, everyday judgement, and I explain how the proposal avoids the problems generated by competing theories. Finally, I argue this ‘normalcy proposal’ most naturally accords with our understanding of the method.
Human creativity generates novel ideas to solve real-world problems. This thereby grants us the power to transform the surrounding world and extend our human attributes beyond what is currently possible. Creative ideas are not just new and unexpected, but are also successful in providing solutions that are useful, efficient and valuable. Thus, creativity optimizes the use of available resources and increases wealth. The origin of human creativity, however, is poorly understood, and semantic measures that could predict the success of generated ideas are currently unknown. Here, we analyze a dataset of design problem-solving conversations in real-world settings by using 49 semantic measures based on WordNet 3.1 and demonstrate that a divergence of semantic similarity, an increased information content, and a decreased polysemy predict the success of generated ideas. The first feedback from clients also enhances information content and leads to a divergence of successful ideas in creative problem solving. These results advance cognitive science by identifying real-world processes in human problem solving that are relevant to the success of produced solutions and provide tools for real-time monitoring of problem solving, student training and skill acquisition. A selected subset of information content (IC Sánchez–Batet) and semantic similarity (Lin/Sánchez–Batet) measures, which are both statistically powerful and computationally fast, could support the development of technologies for computer-assisted enhancements of human creativity or for the implementation of creativity in machines endowed with general artificial intelligence.
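The Lin similarity measure named in this abstract can be illustrated with a minimal sketch. The toy taxonomy and corpus counts below are invented for illustration only; the study itself computes such measures over WordNet 3.1 with the Sánchez–Batet information-content formulation, not over this hand-built hierarchy.

```python
import math

# Toy IS-A hierarchy (child concept -> parent concept); 'animal' is the root.
# These concepts and counts are hypothetical, standing in for WordNet synsets.
PARENT = {"dog": "canine", "wolf": "canine", "canine": "animal",
          "cat": "animal", "animal": None}

# Hypothetical corpus frequencies for the leaf concepts.
LEAF_COUNTS = {"dog": 30, "wolf": 10, "cat": 20}
TOTAL = sum(LEAF_COUNTS.values())

def ancestors(c):
    """Return c and all its ancestors, leaf end first."""
    chain = []
    while c is not None:
        chain.append(c)
        c = PARENT[c]
    return chain

def freq(c):
    """A concept's frequency: total count of every leaf it subsumes."""
    return sum(n for leaf, n in LEAF_COUNTS.items() if c in ancestors(leaf))

def ic(c):
    """Information content, -log p(c): rarer concepts are more informative."""
    return -math.log(freq(c) / TOTAL)

def lin(a, b):
    """Lin similarity: 2 * IC(lcs) / (IC(a) + IC(b))."""
    common = [c for c in ancestors(a) if c in ancestors(b)]
    lcs = max(common, key=ic)  # most informative shared ancestor
    denom = ic(a) + ic(b)
    return 2 * ic(lcs) / denom if denom else 1.0

print(lin("dog", "wolf"))  # share the informative ancestor 'canine': > 0
print(lin("dog", "cat"))   # share only the root 'animal' (IC 0 here): 0.0
```

The sketch shows why a "divergence of semantic similarity" is measurable: as generated ideas move to concepts whose nearest shared ancestor is less informative, pairwise Lin scores drop toward zero.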
Multimedia technologies and ICT in organising e-portfolios for students. An electronic portfolio, also known as an e-portfolio or digital portfolio, is a collection of electronic evidence assembled and managed by a user, usually on the Web. Such electronic evidence may include inputted text, electronic files such as Microsoft Word and Adobe PDF files, images, multimedia, blog entries, and hyperlinks. One approach that can improve the attractiveness of the e-portfolio is presented: the implementation of multimedia technologies within it. In this case we are speaking about a multimedia electronic portfolio. What makes such portfolios very different from traditional portfolios is that they can include scanned or digital photos, video and sound clips, animations, recordings of the students, text, and traditional writings and drawings. Two main groups of tools for creating a multimedia e-portfolio are presented.
Robust virtue epistemology holds that knowledge is true belief obtained through cognitive ability. In this essay I explain that robust virtue epistemology faces a dilemma, and the viability of the theory depends on an adequate understanding of the ‘through’ relation. Greco interprets this ‘through’ relation as one of causal explanation; the success is through the agent’s abilities iff the abilities play a sufficiently salient role in a causal explanation of why she possesses a true belief. In this paper I argue that Greco’s account of the ‘through’ relation is inadequate. I describe kinds of counterexample and explain why salience is the wrong kind of property to track epistemically relevant conditions or to capture the nature of knowledge. Advocates of robust virtue epistemology should develop an alternative account of the ‘through’ relation. I also argue that virtue epistemology should employ an environment-relative interpretation of epistemic virtue.
Contrastive and deviant/default accounts of causation are becoming increasingly common. However, discussions of these accounts have neglected important questions, including how the context determines the contrasts, and what shared knowledge is necessary for this to be possible. I address these questions, using organic chemistry as a case study. Focusing on one example—nucleophilic substitution—I show that the kinds of causal claims that can be made about an organic reaction depend on how the reaction is modelled, and argue that paying attention to the various ways that reactions are modelled has important implications for our understanding of causation.
We investigate the parallelism between aesthetic experience and the practice of phenomenology using Viktor Shklovsky’s theory of “estrangement”. In his letter to Hugo von Hofmannsthal, Husserl claims that aesthetic and phenomenological experiences are similar; in the perception of a work of art we change our attitude in order to concentrate on how the things appear to us instead of what they are. A work of art “forces us into” the aesthetic attitude in the same way as the phenomenological epoché drives us into the phenomenological one. The change of attitudes is a condition of possibility of aesthetic and/or phenomenological experience. Estrangement is an artistic device that breaks the routinized forms of perception: one sees the thing as new and does not just “recognize” it automatically. Shklovsky insists that this is possible if one experiences or feels the form of the work of art—in an affective and even sensuous way. We claim that this is similar to phenomenological seeing, or intuition, which, according to Husserl, should be devoid of all understanding. Phenomenological epoché can also be described as a philosophical technique that aims to arrest the “ready-made,” “taken for granted,” “pre-given” meanings in order to access a new meaning which is not yet stabilized, the “meaning-in-formation.” It is not enough to turn from what appears to how it appears; one has to oscillate between these conflicting attitudes, or rather to keep them both at the same time, thus gaining a kind of 3D-vision of meaning in its becoming. This double life in two different attitudes can be clarified in terms of Roman Jakobson’s theory of antinomic coexistence between the poetic and communicative functions of language. The notion of “double life in two attitudes” uncovers the role that ostranenie can play in the philosophical transformation of the subject, based on the variety and essential mobility of the affective components involved.
Proposing a phenomenological interpretation of a passage from Samuel Beckett, we show how the radicalization of ostranenie can lead even to “meta-estrangement”: to estrangement of the everyday “lack of estrangement.” We conclude with a remark on the productivity of this form of estrangement in the phenomenological context.
Stephen Davies taught philosophy at the University of Auckland, New Zealand. His research specialty is the philosophy of art. He is a former President of the American Society for Aesthetics. His books include Definitions of Art (Cornell UP, 1991), Musical Meaning and Expression (Cornell UP, 1994), Musical Works and Performances (Clarendon, 2001), Themes in the Philosophy of Music (OUP, 2003), Philosophical Perspectives on Art (OUP, 2007), Musical Understandings and Other Essays on the Philosophy of Music (OUP, 2011), The Artful Species: Aesthetics, Art, and Evolution (OUP, 2012), The Philosophy of Art (Wiley-Blackwell, 2nd ed., 2016), and Adornment: What Self-Decoration Tells Us about Who We Are (Bloomsbury Academic, 2020).
I motivate three claims: firstly, attentional traits can be cognitive virtues and vices; secondly, groups and collectives can possess attentional virtues and vices; thirdly, attention has epistemic, moral, social, and political importance. An epistemology of attention is needed to better understand our social-epistemic landscape, including media, social media, search engines, political polarisation, and the aims of protest. I apply attentional normativity to undermine recent arguments for moral encroachment and to illuminate a distinctive epistemic value of occupying particular social positions. A recurring theme is that disproportionate attention can distort, mislead, and misrepresent even when all the relevant claims are true and well supported by evidence. In the informational cacophony of the internet age, epistemology must foreground the cognitive virtues of attunement.
The German zoologist and geneticist Ludwig Plate was a pupil and successor of the “German Darwin” Ernst Haeckel as director of the Institute of Zoology at Jena University. Plate campaigned for a revival of the original Darwinism. His research program, which he labelled “old-Darwinism”, proclaimed the synthesis of selectionism with “moderate Lamarckism” and orthogenesis. This article reconstructs and analyses Plate’s “old-Darwinian” synthesis and sheds light on his controversial biography, especially his conflict with Haeckel.
The article analyzes some key motifs of both classical German phenomenology and contemporary French phenomenology. The theme of sense-formation, a recurring thread throughout Husserl's entire body of work, serves as the discussion's starting point. A special emphasis is put on one of Husserl's posthumously published texts from 1933, in which he distinguishes between the open process of sense-formation [Sinnbildung] and the closed sense-structures [Sinngebilde]. The “phenomenon” to which phenomenological philosophy refers here is not yet a “pre-given thing”, but rather the horizon in which its sense is shaped. This fundamental intuition is crucially important for the project of “nonstandard” phenomenology, which Richir is developing in the context of Francophone philosophy. Drawing equally from Maurice Merleau-Ponty’s phenomenology of language, Richir refers to “the sense that creates itself.” In this way, he continues to develop one of the key intuitions…