High temporal resolution event-related brain potential and electroencephalographic coherence studies of the neural substrate of short-term storage in working memory indicate that the sustained coactivation of both prefrontal cortex and the posterior cortical systems that participate in the initial perception and comprehension of the retained information is involved in its storage. These studies further show that short-term storage mechanisms involve an increase in neural synchrony between prefrontal cortex and posterior cortex and the enhanced activation of long-term memory representations of material held in short-term memory. This activation begins during the encoding/comprehension phase and evidently is prolonged into the retention phase by attentional drive from prefrontal cortex control systems. A parsimonious interpretation of these findings is that the long-term memory systems associated with the posterior cortical processors provide the necessary representational basis for working memory, with the property of short-term memory decay being primarily due to the posterior system. In this view, there is no reason to posit specialized neural systems whose functions are limited to those of short-term storage buffers. Prefrontal cortex provides the attentional pointer system for maintaining activation in the appropriate posterior processing systems. Short-term memory capacity and phenomena such as displacement of information in short-term memory are determined by limitations on the number of pointers that can be sustained by the prefrontal control systems. Key Words: coherence; event-related potentials; imaging; long-term memory; memory; short-term memory; working memory.
The goal of our target article is to establish that electrophysiological data constrain models of short-term memory retention operations to schemes in which activated long-term memory is its representational basis. The temporary stores correspond to neural circuits involved in the perception and subsequent processing of the relevant information, and do not involve specialized neural circuits dedicated to the temporary holding of information outside of those embedded in long-term memory. The commentaries ranged from general agreement with the view that short-term memory stores correspond to activated long-term memory (e.g., Abry, Sato, Schwartz, Loevenbruck & Cathiard [Abry et al.], Cowan, Fuster, Grote, Hickok & Buchsbaum, Keenan, Hyönä & Kaakinen [Keenan et al.], Martin, Morra) to taking definite exception to this view (e.g., Baddeley, Düzel, Logie & Della Sala, Kroger, Majerus, Van der Linden, Colette & Salmon [Majerus et al.], Vallar).
Oxford Studies in Metaphysics is dedicated to the timely publication of new work in metaphysics, broadly construed. These volumes provide a forum for the best new work in this flourishing field. They offer a broad view of the subject, featuring not only the traditionally central topics such as existence, identity, modality, time, and causation, but also the rich clusters of metaphysical questions in neighbouring fields, such as philosophy of mind and philosophy of science. This book is the eighth volume in the series. It contains essays by Cian Dorr and John Hawthorne, Maya Eddon, Shamik Dasgupta, Bill Dunaway, Cody Gilmore, Ted Sider, Aaron Cotnoir, Katherine Hawley, Fabrice Correia and Sven Rosenkranz, David Braddon-Mitchell, and Ross Cameron.
This interview with N. Katherine Hayles, one of the foremost theorists of the posthuman, explores the concerns that led to her seminal book How We Became Posthuman, the key arguments expounded in that book, and the changes in technology and culture in the ten years since its publication. The discussion ranges across the relationships between literature and science; the trans-disciplinary project of developing a methodology appropriate to their intersection; the history of cybernetics in its cultural and political context; the changed role for psychoanalysis in the technoscientific age; and the altering forms of mediated ‘embodiment’ in the posthuman context.
The poetry and journalistic essays of Katherine Tillman often appeared in publications sponsored by the American Methodist church. Collected together for the first time, her works speak to the struggles and triumphs of African-American women.
In 1981 Eleonore Stump and Norman Kretzmann published a landmark article aimed at exploring the classical concept of divine eternity. Taking Boethius as the primary spokesman for the traditional view, they analyse God's eternity as timeless yet as possessing duration. More recently Brian Leftow has seconded Stump and Kretzmann's interpretation of the medieval position and attempted to defend the notion of a durational eternity as a useful way of expressing the sort of life God leads. However, there are good reasons to reject the idea that divine timelessness should be thought of as having duration. The medievals probably did not accept it, as it contradicts a principle of classical metaphysics even more fundamental than the atemporality of the divine. In any case, it is not possible to express the notion of durational eternity in even a minimally coherent way, and the attempt to salvage the concept by appealing to the Thomistic doctrine of analogy is unsuccessful. The best analogy for God's eternity is still the one proposed by Augustine at the end of the fourth century. God lives in a timeless ‘present’, unextended like our temporal present, but immutable and encompassing all time.
Katherine Hawley explores the key ideas about trust in this Very Short Introduction. Drawing on a wide range of disciplines including philosophy, psychology, and evolutionary biology, she emphasizes the nature and importance of trusting and being trusted, from our intimate bonds with significant others to our relationship with the state.
This paper provides a critical overview of recent work on epistemic blame. The paper identifies key features of the concept of epistemic blame and discusses two ways of motivating the importance of this concept. Four different approaches to the nature of epistemic blame are examined. Central issues surrounding the ethics and value of epistemic blame are identified and briefly explored. In addition to providing an overview of the state of the art of this growing but controversial field, the paper highlights areas where future work is needed.
Paul Sheehy has argued that the modal realist cannot satisfactorily allow for the necessity of God's existence. In this short paper I show that she can, and that Sheehy only sees a problem because he has failed to appreciate all the resources available to the modal realist. God may be an abstract existent outside spacetime or He may not be: but either way, there is no problem for the modal realist to admit that He exists at every concrete possible world.
In this age of DNA computers and artificial intelligence, information is becoming disembodied even as the "bodies" that once carried it vanish into virtuality. While some marvel at these changes, envisioning consciousness downloaded into a computer or humans "beamed" _Star Trek_-style, others view them with horror, seeing monsters brooding in the machines. In _How We Became Posthuman,_ N. Katherine Hayles separates hype from fact, investigating the fate of embodiment in an information age. Hayles relates three interwoven stories: how information lost its body, that is, how it came to be conceptualized as an entity separate from the material forms that carry it; the cultural and technological construction of the cyborg; and the dismantling of the liberal humanist "subject" in cybernetic discourse, along with the emergence of the "posthuman." Ranging widely across the history of technology, cultural studies, and literary criticism, Hayles shows what had to be erased, forgotten, and elided to conceive of information as a disembodied entity. Thus she moves from the post-World War II Macy Conferences on cybernetics to the 1952 novel _Limbo_ by cybernetics aficionado Bernard Wolfe; from the concept of self-making to Philip K. Dick's literary explorations of hallucination and reality; and from artificial life to postmodern novels exploring the implications of seeing humans as cybernetic systems. Although becoming posthuman can be nightmarish, Hayles shows how it can also be liberating. From the birth of cybernetics to artificial life, _How We Became Posthuman_ provides an indispensable account of how we arrived in our virtual age, and of where we might go from here.
The world is remarkably stable -- amidst the flux, physical objects continue to persist. But how do things persist? Are they spread out through time as they are spread out through space? Or is persistence very different from spatial extension? These ancient metaphysical questions are at the forefront of contemporary debate once more. Katherine Hawley provides a wide-ranging yet accessible study of this key issue. She also makes a major contribution to current debates about change, vagueness, and language.
Is there a distinctively epistemic kind of blame? It has become commonplace for epistemologists to talk about epistemic blame, and to rely on this notion for theoretical purposes. But not everyone is convinced. Some of the most compelling reasons for skepticism about epistemic blame focus on disanalogies, or asymmetries, between the moral and epistemic domains. In this paper, I defend the idea that there is a distinctively epistemic kind of blame. I do so primarily by developing an account of the nature of epistemic blame. My account draws on a prominent line of theorizing in moral philosophy that ties blame to our relationships with one another. I argue that with my account of epistemic blame in hand, the most compelling worries about epistemic blame can be deflated. There is a distinctively epistemic kind of blame.
Katherine Hawley explores and compares three theories of persistence -- endurance, perdurance, and stage theories -- investigating the ways in which they attempt to account for the world around us. Having provided valuable clarification of its two main rivals, she concludes by advocating stage theory.
The chapter develops a taxonomy of views about the epistemic responsibilities of citizens in a democracy. Prominent approaches to epistemic democracy, epistocracy, epistemic libertarianism, and pure proceduralism are examined through the lens of this taxonomy. The primary aim is to explore options for developing an account of the epistemic responsibilities of citizens in a democracy. The chapter also argues that a number of recent attacks on democracy may not adequately register the availability of a minimal approach to the epistemic responsibilities of citizens in a democracy.
A novel introduction to Jean-Paul Sartre’s existentialist phenomenology. Draws parallels between Sartre’s work and the work of Wittgenstein. Stresses continuities rather than conflict between Sartre and Merleau-Ponty, and between Sartre and post-structuralist/post-modernist thinkers, thus corroborating ‘new Sartre’ readings. Exhibits the influence of Gestalt psychology in Sartre’s descriptions of the life-world. Forms part of the _Blackwell Great Minds_ series, which outlines the views of the great western thinkers and captures the relevance of these figures to the way we think and live today.
In Christ Meets Me Everywhere, Michael Cameron argues that Augustine wanted to train readers of Scripture to transpose themselves into the texts in the same way he did, by the same process of figuration that he found at its core. Tracking Augustine's developing practice of self-transposition into the figures of the biblical texts over the course of his entire career, Cameron shows that this practice is the key to Augustine's hermeneutics.
One challenge in developing an account of the nature of epistemic blame is to explain what differentiates epistemic blame from mere negative epistemic evaluation. The challenge is to explain the difference, without invoking practices or behaviors that seem out of place in the epistemic domain. In this paper, I examine whether the most sophisticated recent account of the nature of epistemic blame—due to Jessica Brown—is up for the challenge. I argue that the account ultimately falls short, but does so in an instructive way. Drawing on the lessons learned, I put forward an alternative approach to the nature of epistemic blame. My account understands epistemic blame in terms of modifications to the intentions and expectations that comprise our “epistemic relationships” with one another. This approach has a number of attractions shared by Brown’s account, but it can also explain the significance of epistemic blame.
There are moments when things suddenly seem strange - objects in the world lose their meaning, we feel like strangers to ourselves, or human existence itself strikes us as bizarre and unintelligible. Through a detailed philosophical investigation of Heidegger's concept of uncanniness (Unheimlichkeit), Katherine Withy explores what such experiences reveal about us. She argues that while others (such as Freud, in his seminal psychoanalytic essay, 'The Uncanny') take uncanniness to be an affective quality of strangeness or eeriness, Heidegger uses the concept to go beyond feeling uncanny to reach the ground of this feeling in our being uncanny. "Heidegger on Being Uncanny" answers those who wonder whether human existence is fundamentally strange to itself by showing that we can be what we are only if we do not fully understand what it is to be us. This fundamental finitude in our self-understanding is our uncanniness. In this first dedicated interpretation of Heidegger's uncanniness, Withy tracks this concept from his early analyses of angst through his later interpretations of the choral ode from Sophocles's Antigone. Her interpretation uncovers a novel and robust continuity in Heidegger's thought and in his vision of the human being as uncanny, and it points the way toward what it is to live well as an uncanny human being.
The paper critically examines recent work on justifications and excuses in epistemology. I start with a discussion of Gerken’s claim that the “excuse maneuver” is ad hoc. Recent work from Timothy Williamson and Clayton Littlejohn provides resources to advance the debate. Focusing in particular on a key insight in Williamson’s view, I then consider an additional worry for the so-called excuse maneuver. I call it the “excuses are not enough” objection. Dealing with this objection generates pressure in two directions: one is to show that excuses are a positive enough normative standing to help certain externalists with important cases; the other is to do so in a way that does not lead back to Gerken’s objection. I show how a Williamson-inspired framework is flexible enough to deal with both sources of pressure. Perhaps surprisingly, I draw on recent virtue epistemology.
Katherine Hawley investigates what trustworthiness means in our lives. We become untrustworthy when we break promises, miss deadlines, or give unreliable information. But we cannot always be sure what we can commit to. Hawley examines the social obstacles to trustworthiness, and explores how we can steer between overcommitment and undercommitment.
ABSTRACT: Deliberative democratic theory, commonly used to explore questions of “political” corporate social responsibility, has become prominent in the literature. This theory has been challenged previously for being overly sanguine about firm profit imperatives, but left unexamined is whether corporate contexts are appropriate contexts for deliberative theory in the first place. We explore this question using the case of Starbucks’ “Race Together” campaign to show that significant challenges exist to corporate deliberation, even in cases featuring genuinely committed firms. We return to the underlying social theory to show that this is not an isolated case: for-profit firms are predictably hostile contexts for deliberation, and significant normative and strategic problems can be expected should deliberative theory be imported uncritically to corporate contexts. We close with recent advances in deliberative democratic theory that might help update the PCSR project, and accommodate the application of deliberation to the corporate context, albeit with significant alterations.
Although the principle of fair subject selection is a widely recognized requirement of ethical clinical research, it often yields conflicting imperatives, thus raising major ethical dilemmas regarding participant selection. In this paper, we diagnose the source of this problem, arguing that the principle of fair subject selection is best understood as a bundle of four distinct sub-principles, each with normative force and each yielding distinct imperatives: fair inclusion; fair burden sharing; fair opportunity; and fair distribution of third-party risks. We first map out these distinct sub-principles, and then identify the ways in which they yield conflicting imperatives for the design of inclusion and exclusion criteria, and the recruitment of participants. We then offer guidance for how decision makers should navigate these conflicting imperatives to ensure that participants are selected fairly.
Was Descartes a Cartesian Dualist? In this controversial study, Gordon Baker and Katherine J. Morris argue that, despite the general consensus within philosophy, Descartes was neither a proponent of dualism nor guilty of the many crimes of which he has been accused by twentieth-century philosophers. In lively and engaging prose, Baker and Morris present a radical revision of the ways in which Descartes' work has been interpreted. Descartes emerges with both his historical importance assured and his philosophical importance redeemed.
Distinguishing between excuses and exemptions advances our understanding of a standard range of problem cases in debates about epistemic norms. But it leaves open a problem of accounting for blameless norm violation in ‘prospective agents’. By shifting focus in our theory of excuses from rational excellence to norms governing the dispositions of agents, we can account for a fuller range of normative phenomena at play in debates about epistemic norms.
A plausible condition on having the standing to blame someone is that the target of blame's wrongdoing must in some sense be your “business”—the wrong must in some sense harm or affect you, or others close to you. This is known as the business condition on standing to blame. Many cases of epistemic blame discussed in the literature do not obviously involve examples of someone harming or affecting another. As such, not enough has been said about how an individual's epistemic failing can really count as another person's business. In this paper, I deploy a relationship-based account of epistemic blame to clarify the conditions under which the business condition can be met in the epistemic domain. The basic idea is that one person's epistemic failing can be another's business in virtue of the way it impairs their epistemic relationship.
There is a distinction between merely having the right belief, and further basing that belief on the right reasons. Any adequate epistemology needs to be able to accommodate the basing relation that marks this distinction. However, trouble arises for Bayesianism. I argue that when we combine Bayesianism with the standard approaches to the basing relation, we get the result that no agent forms their credences in the right way; indeed, no agent even gets close. This is a serious problem, for it prevents us from making epistemic distinctions between agents that are doing a reasonably good job at forming their credences and those that are forming them in clearly bad ways. I argue that if this result holds, then we have a problem for Bayesianism. However, I show how the Bayesian can avoid this problem by rejecting the standard approaches to the basing relation. By drawing on recent work on the basing relation, we can develop an account of the relation that allows us to avoid the result that no agent comes close to forming their credences in the right way. The Bayesian can successfully accommodate the basing relation.
Decades of research conducted in Western, Educated, Industrialized, Rich, & Democratic (WEIRD) societies have led many scholars to conclude that the use of mental states in moral judgment is a human cognitive universal, perhaps an adaptive strategy for selecting optimal social partners from a large pool of candidates. However, recent work from a more diverse array of societies suggests there may be important variation in how much people rely on mental states, with people in some societies judging accidental harms just as harshly as intentional ones. To explain this variation, we develop and test a novel cultural evolutionary theory proposing that the intensity of kin-based institutions will favor less attention to mental states when judging moral violations. First, to better illuminate the historical distribution of the use of intentions in moral judgment, we code and analyze anthropological observations from the Human Relations Area Files. This analysis shows that notions of strict liability—wherein the role for mental states is reduced—were common across diverse societies around the globe. Then, by expanding an existing vignette-based experimental dataset containing observations from 321 people in a diverse sample of 10 societies, we show that the intensity of a society's kin-based institutions can explain a substantial portion of the population-level variation in people's reliance on intentions in three different kinds of moral judgments. Together, these lines of evidence suggest that people's use of mental states has coevolved culturally to fit their local kin-based institutions. We suggest that although reliance on mental states has likely been a feature of moral judgment in human communities over historical and evolutionary time, the relational fluidity and weak kin ties of today's WEIRD societies position these populations' psychology at the extreme end of the global and historical spectrum.
According to Ross Cameron's version of the moving spotlight theory of time, (1) Past and future entities exist; (2) the properties and relations they have are those they have now; but nevertheless (3) there are no fundamental past- or future-tensed facts; instead, tensed facts are made true by fundamental facts about the possession of temporal distributional properties and facts about how old things are. I argue that the account isn't sufficiently distinct from the B-theory to fit the usual A-theorist's tastes and arguments, since i) like the traditional spotlight it consists of a B-theoretic metaphysics with one small A-theoretic element tacked on, and since ii) in a sense it does not admit fundamental change. I also argue that the proposed grounding of tensed facts in tenseless facts does not work in certain cases.
It is clear throughout Cognitive Gadgets that Heyes believes the development of cognitive capacities results from the interaction of genes and experience. However, she opposes cognitive instinct theorists to her own view that uniquely human capacities are cognitive gadgets. Instinct theorists believe that cognitive capacities are substantially produced by selection, with the environment playing a triggering role. Heyes's position is that humans have general learning capacities similar to those present across taxa, and that sophisticated human cognition is substantially created by our socioculturally transmitted environment. A core strategy of Heyes's is to take evidence of learning altering a cognitive capacity as grounds for concluding that the capacity is a cognitive gadget and not an instinct. We draw on recent work on the evolution of learning preparedness to examine the adequacy of this strategy. In particular, we analyse experimental evolution work showing how selection affects cognition within the laboratory. First, this work reveals that change due to learning can still be retained under genetic assimilation. This suggests that domain-specific adaptation can coexist with learning (moderate nativism), an option missed by the instinct-versus-gadget distinction. Second, we describe the conditions that select for increased preparedness in learning: certainty, reliability, and particular costs. We consider how these conditions can be used when conducting evolutionary reasoning about cognition, applying them to the important capacity for imitation. We find that the conditions lend theoretical support to moderate nativism about the capacity to imitate, which is supported by psychological evidence.
It has been argued that humans can face an ethical/epistemic dilemma over the automatic stereotyping involved in implicit bias: ethical demands require that we consistently treat people equally, as equally likely to possess certain traits, but if our aim is knowledge or understanding, our responses should reflect social inequalities, that is, the fact that members of certain social groups are statistically more likely than others to possess particular features. I use psychological research to argue that often the best choice from the epistemic perspective is the same as the best choice from the ethical perspective: to avoid automatic stereotyping even when this involves failing to reflect social realities in our judgements. This argument has an important implication: it shows that it is not possible to successfully defend an act of automatic stereotyping simply on the basis that the stereotype reflects an aspect of social reality. An act of automatic stereotyping can be poor from an epistemic perspective even if the stereotype that is activated reflects reality.
This article explores the relationship between pragmatic encroachment and epistemic permissiveness. If the suggestion that all epistemic notions are interest-relative is viable, then it seems that a certain species of epistemic permissivism must be viable as well. For, if all epistemic notions are interest-relative, then, sometimes, parties in paradigmatic cases of shared evidence can be maximally rational in forming competing basic doxastic attitudes towards the same proposition. However, I argue that this total pragmatic encroachment is not tenable, and, thus, epistemic permissivism cannot be vindicated in this way.
Evaluating counterfactuals in worlds with deterministic laws poses a puzzle. In a wide array of cases, it does not seem plausible that, if a non-actual event were to occur, either the past or the laws would be different. But it’s also difficult to see how we can avoid this result. Some philosophers have argued that we can avoid this dilemma by allowing that a proposition can be a law even though it has violations. On this view, for the relevant cases, the past and the laws would still hold, but the laws would have a violation. In this paper, I raise a problem for the claim that the laws and the past are preserved for all of the relevant counterfactual antecedents. I further argue that this problem undermines motivating the possibility of violations on the grounds that they allow us to hold that the past and the laws are typically counterfactually preserved, even if they are not always preserved.
Du Châtelet’s 1740 text Foundations of Physics tackles three of the major foundational issues facing natural philosophy in the early eighteenth century: the problem of bodies, the problem of force, and the question of appropriate methodology. This paper offers an introduction to Du Châtelet’s philosophy of science, as expressed in her Foundations of Physics, primarily through the lens of the problem of bodies.
Symmetry considerations dominate modern fundamental physics, both in quantum theory and in relativity. Philosophers are now beginning to devote increasing attention to such issues as the significance of gauge symmetry, quantum particle identity in the light of permutation symmetry, how to make sense of parity violation, the role of symmetry breaking, the empirical status of symmetry principles, and so forth. These issues relate directly to traditional problems in the philosophy of science, including the status of the laws of nature, the relationships between mathematics, physical theory, and the world, and the extent to which mathematics suggests new physics. This entry begins with a brief description of the historical roots and emergence of the concept of symmetry that is at work in modern science. It then turns to the application of this concept to physics, distinguishing between two different uses of symmetry: symmetry principles versus symmetry arguments. It mentions the different varieties of physical symmetries, outlining the ways in which they were introduced into physics. Then, stepping back from the details of the various symmetries, it makes some remarks of a general nature concerning the status and significance of symmetries in physics.
Epistemological Disjunctivism is a view about paradigm cases of perceptual knowledge. Duncan Pritchard claims that it is particularly well suited to accounting for internalist and externalist intuitions. A number of authors have disputed this claim, arguing that there are problems for Pritchard’s way with internalist intuitions. I share the worry. However, I don’t think it has been expressed as effectively as it can be. My aim in this paper is to present a new way of formulating the worry, in terms of an “explanatory challenge”. The explanatory challenge is a simple, yet powerful and illuminating challenge for Epistemological Disjunctivism. It is illuminating in the sense that it shows us why Epistemological Disjunctivism must take on certain internalistically problematic commitments. A secondary aim of this paper is to examine whether the recently much-discussed distinction between justifications and excuses in epistemology can support an adequate response. I will argue that it cannot.
A leader in the Confessing Church, an outspoken opponent of Anti-Semitism, and, late in life, a committed supporter of the state of Israel, Karl Barth was nevertheless a firm and unflinching anti-Judaic theologian. _That Jesus Was Born a Jew_ devotes itself to an analysis and description of these two sides of Barth's thought, from the period of the _Römerbrief_ through the Church Dogmatics and later postwar addresses. It places Barth's thought against the backdrop of his contemporaries and the developments in German academic theology Barth at once repudiated and called his own. Though no claim is made to set out Judaic self-understanding, Barth's conception of the people of Israel is understood, in contrast, as specifically Christian: Barth's is a fully dogmatic interpretation of the Jews. Katherine Sonderegger traces the development of Barth's commitment to the integrity of Christian self-description. In the process, she explores the conservation of the Church's theological past that gives Barth's thought its anti-Judaic character and his Christological concentration that makes Jesus the Jew the foundation for Christian opposition to anti-Semitism and Nazism. She analyzes Church Dogmatics as well as the second edition of Romans, focusing on Barth's exegesis of the types of prophet and pharisee; and she provides an evaluation of Barth's work, with constructive proposals for the contemporary reassessment of Judaism.
As Heidegger acknowledges, our understanding is essentially situated and so limited by the context and tradition into which it is thrown. But this ‘situatedness’ does not exhaust Heidegger's concept of ‘thrownness’. By examining this concept and its grammar, I develop a more complete interpretation. I identify several different kinds of finitude or limitation in our understanding, and touch on ways in which we confront and carry different dimensions of our past.
Highlighting main issues and controversies, this book brings together current philosophical discussions of symmetry in physics to provide an introduction to the subject for physicists and philosophers. The contributors cover all the fundamental symmetries of modern physics, such as CPT and permutation symmetry, as well as discussing symmetry-breaking and general interpretational issues. Classic texts are followed by new review articles and shorter commentaries for each topic. Suitable for courses on the foundations of physics, philosophy of physics and philosophy of science, the volume is a valuable reference for students and researchers.
Background: Responsive neurostimulation has been utilized as a treatment for intractable epilepsy. The RNS System delivers stimulation, via leads covering the seizure foci, in response to detections of predefined epileptiform activity, with the goal of decreasing seizure frequency and severity. While thalamic leads are often implanted in combination with cortical strip leads, implantation and stimulation with bilateral thalamic leads alone is less common, and the ability to detect electrographic seizures using RNS System thalamic leads is uncertain. Objective: The present study retrospectively evaluated fourteen patients with RNS System depth leads implanted in the thalamus, with or without concomitant implantation of cortical strip leads, to determine the ability to detect electrographic seizures in the thalamus. Detailed patient presentations and lead trajectories were reviewed alongside electroencephalographic analyses. Results: Anterior nucleus thalamic leads, whether bilateral or unilateral and combined with a cortical strip lead, successfully detected and terminated epileptiform activity, as demonstrated by Cases 2 and 3. Similarly, bilateral centromedian thalamic leads, or a combination of one centromedian thalamic lead alongside a cortical strip lead, also demonstrated the ability to detect electrographic seizures, as seen in Cases 6 and 9. Bilateral pulvinar leads likewise produced reliable seizure detection in Patient 14. Detections of electrographic seizures in thalamic nuclei did not appear to be affected by whether the patient was pediatric or adult at the time of RNS System implantation. Sole thalamic leads paralleled the combination of thalamic and cortical strip leads in terms of preventing the propagation of electrographic seizures. Conclusion: Thalamic nuclei present a promising target for detection and stimulation via the RNS System for seizures with multifocal or generalized onsets.
These targets provide a modifiable, reversible therapeutic option for patients who are not candidates for surgical resection or ablation.
In De Anima 2.4, Aristotle claims that nutritive soul encompasses two distinct biological functions: nutrition and reproduction. We challenge a pervasive interpretation which posits ‘nutrients’ as the correlative object of the nutritive capacity. Instead, the shared object of nutrition and reproduction is that which is nourished and reproduced: the ensouled body, qua ensouled. Both functions aim at preserving this object, and thus at preserving the form, life, and being of the individual organism. In each case, we show how Aristotle’s detailed biological analysis supports this ontological argument.
We offer an overview of what we take to be the main themes in Annalisa Coliva’s book, Moore and Wittgenstein: Scepticism, Certainty and Common Sense. In particular, we focus on the ‘framework reading’ that she offers of Wittgenstein’s On Certainty and its anti-sceptical implications. While broadly agreeing with the proposal that Coliva puts forward on this score, we do suggest one important supplementation to the view—viz., that this way of dealing with radical scepticism needs to be augmented with an account of the meta-sceptical problem which this proposal generates, which we call epistemic vertigo.