_K. al-Manām_ by Ibn Abī al-Dunyā is a compendium of 350 Muslim dream narratives in Arabic. The English introduction examines the function of dreams in classical Arabic literature with a focus on dreams as a means of edification.
These essays engage Jin Y. Park’s recent translation of the work of Kim Iryŏp, a Buddhist nun and public intellectual in early twentieth-century Korea. Park’s translation of Iryŏp’s _Reflections of a Zen Buddhist Nun_ was the subject of two book panels at recent conferences: the first a plenary session at the annual meeting of the Society for Asian and Comparative Philosophy, and the second a group program session at the Eastern Division meeting of the American Philosophical Association, sponsored by the International Society for Buddhist Philosophy. This exchange also includes a response from Park.
Epistemic injustice sits at the intersection of ethics, epistemology, and social justice. Generally, this philosophical term describes when a person is wrongfully discredited as a knower; within the clinical space, epistemic injustice is the underlying reason that some patient testimonies are valued above others. The following essay connects patterns of social prejudice to the clinical realm in the United States, illustrating how factors such as race, gender identity, and socioeconomic status influence epistemic credence and, by extension, the quality of healthcare a person receives. After describing how epistemic injustice disproportionately harms already vulnerable patients, I propose a narrative therapy intervention. This intervention can help providers re-frame their relationships with patients such that they come to view patients as valuable sources of unique knowledge. Though I identify this intervention as a valuable step in addressing clinical epistemic injustice, I call upon medical educators and practitioners to further uplift the voices, perspectives, and stories of marginalized patients.
Most public and non-profit organisations that fund health research provide the majority of their funding in the form of grants. The calls for grant applications are often untargeted, such that a wide variety of applications may compete for the same funding. The grant review process therefore plays a critical role in determining how limited research resources are allocated. Despite this, little attention has been paid to whether grant review criteria align with widely endorsed ethical criteria for allocating health research resources. Here, we analyse the criteria and processes that ten of the largest public and non-profit research funders use to choose between competing grant applications. Our data suggest that research funders rarely instruct reviewers to consider disease burden or to prioritise research for sicker or more disadvantaged populations, and typically only include scientists in the review processes. This is liable to undermine efforts to link research funding to health needs.
The vast majority of health research resources are used to study conditions that affect a small, advantaged portion of the global population. This distribution has been widely criticized as inequitable and threatens to exacerbate health disparities. However, there has been little systematic work on what individual health research funders ought to do in response. In this article, we analyze the general and special duties of research funders to the different populations that might benefit from health research. We assess how these duties apply to governmental, multilateral, nonprofit, and for-profit organizations. We thereby derive a framework for how different types of funders should take the beneficiaries of research into account when they allocate scarce research resources.
Expanding the scope of existential discourse beyond the Western tradition, this book engages Asian philosophies to reassess vital questions of life's purpose, death's imminence, and our capacity for living meaningfully in conditions of uncertainty. Inspired by European existentialism in theory, the book explores concrete techniques for existential practice via the philosophies of East Asia. The investigation begins with the provocative existential writings of twentieth-century Korean Buddhist nun Kim Iryŏp, who asserts that meditative concentration conducts a potent energy outward throughout the entire karmic network, enabling the radical transformation of our shared existential conditions. Understanding her claim requires a study of East Asian traditions more broadly. Considering practices as diverse as Song-dynasty Chinese views on mental cultivation, Buddhist merit-making ceremonies, the ritual memorization and recitation of texts, and Yijing divination, the book concludes by advocating a speculative turn. This 'speculative existentialism' counters the suspicion toward metaphysics characteristic of twentieth-century European existential thought and, at the same time, advances a program for action. It is not a how-to guide for living, but rather a philosophical methodology that takes seriously the power of mental cultivation to transform the meaning of the life that we share.
We discuss the role of prior authorization (PA) in supporting patient-centered care (PCC) by directing health system resources and thus improving the system's ability to meet the needs of individual patients. We begin with an account of PCC as a standard that should be aimed for in patient care. In order to achieve widespread PCC, appropriate resource management is essential in a healthcare system. This brings us to PA, and we present an idealized view of PA in order to argue that, at its best, it can contribute to the provision of PCC. PA is a means of cost saving, and as such it has had mixed success. The example of the US demonstrates how implementation of PA has increased health inequalities, whereas best practice has the potential to reduce them. In contrast, systems of universal coverage, like those in Europe, may use the cost savings of PA to better address individuals' care and PCC. The conclusion we offer is therefore an optimistic one, pointing towards areas of supportive overlap between PCC and PA, where usually the incongruities are most evident.
We prove that there exists a noncomputable c.e. real which is low for weak 2-randomness, a definition of randomness due to Kurtz, and that all reals which are low for weak 2-randomness are low for Martin-Löf randomness.
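In standard notation (my gloss; the abstract does not spell this out), writing Low(C) for the class of reals low for a randomness notion C, W2R for the weakly 2-random reals, and MLR for the Martin-Löf random reals, the results can be stated as follows:

```latex
% A is low for weak 2-randomness when relativizing to A destroys no randomness:
% every weakly 2-random real remains weakly 2-random relative to A.
A \in \mathrm{Low}(\mathrm{W2R}) \iff \mathrm{W2R} \subseteq \mathrm{W2R}^{A}.
% The paper's two results:
(\exists A)\,\bigl[\, A \text{ is c.e., noncomputable, and } A \in \mathrm{Low}(\mathrm{W2R}) \,\bigr],
\qquad
\mathrm{Low}(\mathrm{W2R}) \subseteq \mathrm{Low}(\mathrm{MLR}).
```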
This volume introduces readers to the main philosophical issues of measurement in medicine, illustrating the connections between the natural and social sciences by integrating essays on causation, measuring instruments and issues of measurement and policy.
Hierarchical Bayesian models (HBMs) provide an account of Bayesian inference in a hierarchically structured hypothesis space. Scientific theories are plausibly regarded as organized into hierarchies in many cases, with higher levels sometimes called ‘paradigms’ and lower levels encoding more specific or concrete hypotheses. Therefore, HBMs provide a useful model for scientific theory change, showing how higher-level theory change may be driven by the impact of evidence on lower levels. HBMs capture features described in the Kuhnian tradition, particularly the idea that higher-level theories guide learning at lower levels. In addition, they help resolve certain issues for Bayesians, such as scientific preference for simplicity and the problem of new theories.
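The two-level picture described in this abstract can be sketched computationally. The following is a minimal sketch in a toy coin-flip domain; the paradigm names, hypothesis sets, and numbers are my illustrative assumptions, not drawn from the paper. Evidence enters only at the level of specific hypotheses, yet marginalizing over them shifts belief at the paradigm level:

```python
# Two-level hierarchical Bayesian model: paradigms at the top, specific
# hypotheses (coin biases) at the bottom. All names and values illustrative.

paradigms = {
    "fair-ish": [0.45, 0.5, 0.55],  # biases this paradigm makes available
    "biased":   [0.1, 0.9],
}
prior_paradigm = {"fair-ish": 0.5, "biased": 0.5}

def posterior(data):
    """Joint posterior over (paradigm, hypothesis) pairs, with a uniform
    prior over hypotheses within each paradigm. data: list of 0/1 flips."""
    joint = {}
    for p, hyps in paradigms.items():
        for theta in hyps:
            prior = prior_paradigm[p] / len(hyps)
            like = 1.0
            for x in data:
                like *= theta if x == 1 else (1 - theta)
            joint[(p, theta)] = prior * like
    z = sum(joint.values())
    return {k: v / z for k, v in joint.items()}

def paradigm_posterior(data):
    """Marginalize out the specific hypotheses: higher-level theory change
    driven entirely by the impact of evidence at the lower level."""
    out = {p: 0.0 for p in paradigms}
    for (p, theta), pr in posterior(data).items():
        out[p] += pr
    return out

# A long run of heads shifts most of the mass to the "biased" paradigm:
print(paradigm_posterior([1] * 10))
```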
Schnorr randomness is a notion of algorithmic randomness for real numbers closely related to Martin-Löf randomness. After its initial development in the 1970s the notion received considerably less attention than Martin-Löf randomness, but recently interest has increased in a range of randomness concepts. In this article, we explore the properties of Schnorr random reals, and in particular the c.e. Schnorr random reals. We show that there are c.e. reals that are Schnorr random but not Martin-Löf random, and provide a new characterization of Schnorr random real numbers in terms of prefix-free machines. We prove that unlike Martin-Löf random c.e. reals, not all Schnorr random c.e. reals are Turing complete, though all are in high Turing degrees. We use the machine characterization to define a notion of "Schnorr reducibility" which allows us to calibrate the Schnorr complexity of reals. We define the class of "Schnorr trivial" reals, which are ones whose initial segment complexity is identical to that of the computable reals, and demonstrate that this class has non-computable members.
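For readers new to the area, the standard test-based definitions (not restated in the abstract itself) can be put as follows:

```latex
% A Martin-L\"of test is a uniformly c.e. sequence (U_n) of open sets with
% \mu(U_n) \le 2^{-n}. A Schnorr test additionally requires each measure
% \mu(U_n) to be a computable real (equivalently, one may take \mu(U_n) = 2^{-n}).
x \text{ is Schnorr random} \iff x \notin \bigcap_{n} U_n
\ \text{ for every Schnorr test } (U_n).
% Since every Schnorr test is a Martin-L\"of test, every Martin-L\"of random
% real is Schnorr random; the abstract's first result shows the inclusion is
% proper even among c.e. reals.
```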
We prove that the degree structures of the d.c.e. and the 3-c.e. Turing degrees are not elementarily equivalent, thus refuting a conjecture of Downey. More specifically, we show that the following statement fails in the former but holds in the latter structure: There are degrees f > e > d > 0 such that any degree u ≤ f is either comparable with both e and d, or incomparable with both.
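The distinguishing sentence can be written out formally (standard partial-order notation with least element 0; the formalization is mine):

```latex
% "Comparable" means \le in one direction or the other. The sentence below
% holds in the 3-c.e. Turing degrees but fails in the d.c.e. Turing degrees:
\exists f\, \exists e\, \exists d\; \Bigl[\, f > e > d > 0 \;\wedge\;
  \forall u \le f \,\bigl(
    \bigl((u \le e \vee e \le u) \wedge (u \le d \vee d \le u)\bigr)
    \;\vee\;
    \bigl(u \not\le e \wedge e \not\le u \wedge u \not\le d \wedge d \not\le u\bigr)
  \bigr) \,\Bigr]
```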
State Medicaid programs have proposed closed formularies to limit spending on drugs. Closed formularies can be justified when they enable spending on other socially valuable aims. However, it is still necessary to justify guidelines informing formulary design, which can be done through a process of decision making that includes the public. This article examines criticisms that Medicaid closed formularies limit deliberation about decisions that affect drug access and unfairly disadvantage poor patients. Although unfairness to poor patients is a risk, it is not a problem unique to Medicaid, since private insurance programs have also implemented closed formularies.
We describe new results in parametrized complexity theory. In particular, we prove a number of concrete hardness results for W[P], the top level of the hardness hierarchy introduced by Downey and Fellows in a series of earlier papers. We also study the parametrized complexity of analogues of PSPACE via certain natural problems concerning k-move games. Finally, we examine several aspects of the structural complexity of W[P] and related classes. For instance, we show that W[P] can be characterized in terms of DTIME(…) and NP.
This book presents new results in computability theory, a branch of mathematical logic and computer science that has become increasingly relevant in recent years. The field's connections with disparate areas of mathematical logic and mathematics more generally have grown deeper, and now have a variety of applications in topology, group theory, and other subfields. This monograph establishes new directions in the field, blending classic results with modern research areas such as algorithmic randomness. The significance of the book lies not only in the depth of the results contained therein, but also in the fact that the notions the authors introduce allow them to unify results from several subfields of computability theory.
This interdisciplinary collection of essays highlights the relevance of Buddhist doctrine and practice to issues of globalization. From philosophical, religious, historical, and political perspectives, the authors show that Buddhism—arguably the world’s first transnational religion—is a rich resource for navigating today’s interconnected world.
Two of the most influential theories about scientific inference are inference to the best explanation (IBE) and Bayesianism. How are they related? Bas van Fraassen has claimed that IBE and Bayesianism are incompatible rival theories, as any probabilistic version of IBE would violate Bayesian conditionalization. In response, several authors have defended the view that IBE is compatible with Bayesian updating. They claim that the explanatory considerations in IBE are taken into account by the Bayesian because the Bayesian either does or should make use of them in assigning probabilities to hypotheses. I argue that van Fraassen has not succeeded in establishing that IBE and Bayesianism are incompatible, but that the existing compatibilist response is also not satisfactory. I suggest that a more promising approach to the problem is to investigate whether explanatory considerations are taken into account by a Bayesian who assigns priors and likelihoods on his or her own terms. In this case, IBE would emerge from the Bayesian account, rather than being used to constrain priors and likelihoods. I provide a detailed discussion of the case of how the Copernican and Ptolemaic theories explain retrograde motion, and suggest that one of the key explanatory considerations is the extent to which the explanation a theory provides depends on its core elements rather than on auxiliary hypotheses. I then suggest that this type of consideration is reflected in the Bayesian likelihood, given priors that a Bayesian might be inclined to adopt even without explicit guidance by IBE. The aim is to show that IBE and Bayesianism may be compatible, not because they can be amalgamated, but rather because they capture substantially similar epistemic considerations. Contents: 1 Introduction; 2 Preliminaries; 3 Inference to the Best Explanation; 4 Bayesianism; 5 The Incompatibilist View: Inference to the Best Explanation Contradicts Bayesianism; 5.1 Criticism of the incompatibilist view; 6 Constraint-Based Compatibilism; 6.1 Criticism of constraint-based compatibilism; 7 Emergent Compatibilism; 7.1 Analysis of inference to the best explanation; 7.1.1 Inference to the best explanation on specific hypotheses; 7.1.2 Inference to the best explanation on general theories; 7.1.3 Copernicus versus Ptolemy; 7.1.4 Explanatory virtues; 7.1.5 Summary; 7.2 Bayesian account; 8 Conclusion.
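The key explanatory consideration named in this abstract, namely how far a theory's explanation depends on its core elements rather than on auxiliary hypotheses, has a natural gloss in the Bayesian marginal likelihood. The sketch below uses toy numbers of my own, not the paper's analysis: a theory whose prediction flows from its core fits the evidence across most auxiliary settings, while one that leans on auxiliaries fits only under a tuned few, and averaging over auxiliaries rewards the former.

```python
# Marginal likelihood P(E | T) = sum_a P(E | T, a) P(a | T), with a uniform
# prior over auxiliary settings. All figures are illustrative assumptions.

def marginal_likelihood(p_e_given_aux):
    """Average the likelihood of the evidence over auxiliary settings."""
    return sum(p_e_given_aux) / len(p_e_given_aux)

# Probability of the evidence (e.g., retrograde motion) under ten auxiliary
# settings of each theory:
core_driven = [0.9] * 10               # predicted almost regardless of auxiliaries
auxiliary_driven = [0.9] + [0.05] * 9  # predicted only under one tuned auxiliary

print(marginal_likelihood(core_driven))      # close to 0.9
print(marginal_likelihood(auxiliary_driven)) # much lower, about 0.135
```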
Background: Functional neurodiagnostics could allow researchers and clinicians to distinguish more accurately between the unresponsive wakefulness syndrome and the minimally conscious state. It remains unclear how it informs surrogate decision-making. Objective: To explore how the next of kin of patients with disorders of consciousness (DOC) interpret the results of a functional neurodiagnostics measure, and how and why their interpretations influence their attitudes towards medical decisions. Methods and Sample: We conducted problem-centered interviews with seven next of kin of patients with DOC who had undergone a functional HD-EEG examination at a neurological rehabilitation center in Germany. The examination included an auditory oddball paradigm and a motor imagery task to detect hidden awareness. We analyzed the interview transcripts using structuring qualitative content analysis. Results: Regardless of the diagnostic results, all participants were optimistic about the patients’ meaningful recovery. We hypothesize that participants deal with the results of examinations according to their belief system. Thus, an unfavorable evaluation of the patient’s state had the potential to destabilize the participant’s belief system. To re-stabilize or to prevent the destabilization of their belief system, participants used different strategies. Participants accepted a “positive” HD-EEG result since it stabilized their belief system. Conclusion: We hypothesize that a group of next of kin of patients with DOC deals with functional neurodiagnostics results on the basis of the result’s value and their high hope that the patient will recover meaningfully. A psychological mechanism seems to moderate the impact of functional neurodiagnostics on surrogate treatment decisions.
The exchange between Peter Park, Dan Flory and Leah Kalmanson on Park’s book Africa, Asia and the History of Philosophy: Racism in the Formation of the Philosophical Canon took place during the APA’s 2016 Central Division meeting on a panel sponsored by the Committee on Asian and Asian-American Philosophers and Philosophies. After having peer-reviewed the exchange, JWP invited Sonia Sikka and Mark Larrimore to engage with these papers. All five papers are published together in this issue.
Psychological studies show that the beliefs of two agents in a hypothesis can diverge even if both agents receive the same evidence. This phenomenon of belief polarisation is often explained by invoking biased assimilation of evidence, where the agents’ prior views about the hypothesis affect the way they process the evidence. We suggest, using a Bayesian model, that even if such influence is excluded, belief polarisation can still arise by another mechanism. This alternative mechanism involves differential weighting of the evidence arising when agents have different initial views about the reliability of their sources of evidence. We provide a systematic exploration of the conditions for belief polarisation in Bayesian models which incorporate opinions about source reliability, and we discuss some implications of our findings for the psychological literature.
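The mechanism can be made concrete with a minimal sketch. The model below is my illustrative choice, not necessarily the authors' exact one: each agent holds a joint prior over the hypothesis H and the source's reliability R, where a reliable source tracks the truth and an unreliable one anti-tracks it. Both agents start undecided about H and see identical reports; only their reliability priors differ, yet their posteriors on H move in opposite directions.

```python
# Belief polarisation from differing priors about source reliability.
# Parameter values (acc=0.9, priors 0.8 vs 0.2) are illustrative assumptions.

def update(prior_h, prior_r, reports, acc=0.9):
    """Joint Bayesian update over (H, R). R=1: source asserts the truth with
    probability `acc`; R=0: source asserts the opposite of the truth with
    probability `acc`. `reports` lists booleans: True = source asserted H.
    Returns the posterior probability of H."""
    joint = {
        (h, r): (prior_h if h else 1 - prior_h) * (prior_r if r else 1 - prior_r)
        for h in (0, 1) for r in (0, 1)
    }
    for said_h in reports:
        for (h, r) in joint:
            truthful = acc if r else 1 - acc           # chance of speaking truly
            p_said_h = truthful if h else 1 - truthful # prob of asserting H
            joint[(h, r)] *= p_said_h if said_h else 1 - p_said_h
    z = sum(joint.values())
    return (joint[(1, 0)] + joint[(1, 1)]) / z

reports = [True] * 5                  # both agents hear the same five pro-H reports
trusting = update(0.5, 0.8, reports)  # agent who mostly trusts the source
wary     = update(0.5, 0.2, reports)  # agent who mostly treats it as anti-reliable
print(trusting, wary)                 # same evidence, beliefs move apart
```

Because the wary agent thinks a confident stream of pro-H reports is what an anti-reliable source would produce if H were false, the very evidence that raises the trusting agent's credence lowers hers.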
The no miracles argument is one of the main arguments for scientific realism. Recently it has been alleged that the no miracles argument is fundamentally flawed because it commits the base rate fallacy. The allegation is based on the idea that the appeal of the no miracles argument arises from inappropriate neglect of the base rate of approximate truth among the relevant population of theories. However, the base rate fallacy allegation relies on an assumption of random sampling of individuals from the population, which cannot be made in the case of the no miracles argument. Therefore the base rate fallacy objection to the no miracles argument fails. I distinguish between a “local” and a “global” form of the no miracles argument. The base rate fallacy objection has been leveled at the local version. I argue that the global argument plays a key role in supporting a base-rate-fallacy-free formulation of the local version of the argument.
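The base-rate point at issue can be shown with worked numbers (the figures below are my illustrative assumptions, not the author's): under the random-sampling reading that the essay disputes, even a strong association between approximate truth and empirical success confers little confidence when approximately true theories are rare in the population.

```python
# Bayes' theorem applied to the base-rate reading of the no miracles argument.
# All probabilities are illustrative assumptions.

def p_true_given_success(base_rate, p_success_if_true=0.9, p_success_if_false=0.05):
    """P(approximately true | empirically successful), under random sampling
    of a theory from a population -- the very assumption the essay rejects."""
    num = p_success_if_true * base_rate
    den = num + p_success_if_false * (1 - base_rate)
    return num / den

print(p_true_given_success(0.5))    # high base rate: success is compelling
print(p_true_given_success(0.001))  # rare truth: success means little
```

This is exactly why the objection turns on how the "population of theories" is sampled; the essay's reply is that no such random-sampling assumption is available to the objector.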
This volume explores the role of some of the most prominent twentieth-century philosophers and political thinkers as teachers. It examines what obstacles they confronted as teachers and how they overcame them in conveying truth to their students in an age dominated by ideological thinking.