Korean political parties have been organizationally unstable for decades, even after 1987, when a democratic transition from authoritarian military-based politics was achieved. Many studies have argued that the instability has been caused by the Confucian culture of Korean politics. This paper suggests a different view of the political phenomenon: Korean party instability has been due to self-interested politicians, rather than to Confucian morality. This study examines the proposition with a historical exploration of Korean political parties between 1987 and 2012.
Collapse is a term that has attracted much attention in social science literature in recent years, but there remain substantial areas of disagreement about how it should be understood in historical contexts. More specifically, the use of the term collapse often merely serves to dramatize long-past events, to push human actors into the background, and to mystify the past intellectually. At the same time, since human societies are complex systems, the alternative involves grasping the challenges that a holistic analysis presents, taking account of the many different levels and paces at which societies function, and developing appropriate methods that help to integrate science and history. Often neglected elements in considerations of collapse are the perceptions and beliefs of a historical society and how a given society deals with change; an important facet of this, almost entirely ignored in the discussion, is the understanding of time held by the individuals and social groups affected by change; and from this perspective ‘collapse’ depends very much on perception, including the perceptions of the modern commentator. With this in mind, this article challenges simplistic notions of ‘collapse’ in an effort to encourage a more nuanced understanding of the impact and process of both social and environmental change on past human societies.
The aim of this article is to investigate the roles of commutative diagrams (CDs) in a specific mathematical domain, and to unveil the reasons underlying their effectiveness as a mathematical notation; this will be done through a case study. It will be shown that CDs do not depict spatial relations, but represent mathematical structures. CDs will be interpreted as a hybrid notation that goes beyond the traditional bipartition of mathematical representations into diagrammatic and linguistic. It will be argued that one of the reasons why CDs form a good notation is that they are highly mathematically tractable: experts can obtain valid results by ‘calculating’ with CDs. These calculations take the form of ‘diagram chases’. In order to draw inferences, experts move algebraic elements around the diagrams. It will be argued that these diagrams are dynamic. It is thanks to their dynamicity that CDs can externalize the relevant reasoning and allow experts to draw conclusions directly by manipulating them. Lastly, it will be shown that CDs play essential roles in the context of proof as well as in other phases of the mathematical enterprise, such as discovery and conjecture formation.
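The ‘diagram chase’ described above can be made concrete with a generic commutative square (a minimal example of my own, not one drawn from the article's case study), written here in the common tikz-cd notation:

```latex
% A commutative square: the diagram asserts the single equation
% g \circ f = k \circ h without writing it out linearly.
% Requires \usepackage{tikz-cd}.
\begin{tikzcd}
A \arrow[r, "f"] \arrow[d, "h"'] & B \arrow[d, "g"] \\
C \arrow[r, "k"']                & D
\end{tikzcd}
```

A diagram chase then ‘moves’ an element a of A along both paths, using g(f(a)) = k(h(a)) at each square to derive the desired identity; larger diagrams chain such squares together.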
Case-based instruction is a stable feature of ethics education; however, little is known about the attributes of the cases that make them effective. Emotions are an inherent part of ethical decision-making and one source of information actively stored in case-based knowledge, making them an attribute of cases that likely facilitates case-based learning. Emotions also make cases more realistic, an essential component for effective case-based instruction. The purpose of this study was to investigate the influence of emotional case content, and complementary socio-relational case content, on case-based knowledge acquisition and transfer on future ethical decision-making tasks. Study findings suggest that emotional case content stimulates retention of cases and facilitates transfer of ethical decision-making principles demonstrated in cases.
Organizational leaders face environmental challenges and pressures that put them under ethical risk. Navigating this ethical risk is demanding given the dynamics of contemporary organizations. Traditional models of ethical decision-making (EDM) are an inadequate framework for understanding how leaders respond to ethical dilemmas under conditions of uncertainty and equivocality. Sensemaking models more accurately illustrate leader EDM and account for individual, social, and environmental constraints. Using the sensemaking approach as a foundation, previous EDM models are revised and extended to comprise a conceptual model of leader EDM. Moreover, the underlying factors in the model are highlighted—constraints and strategies. Four trainable, compensatory strategies (emotion regulation, self-reflection, forecasting, and information integration) are proposed and described that aid leaders in navigating ethical dilemmas in organizations. Empirical examinations demonstrate that tactical application of the strategies may aid leaders in making sense of complex and ambiguous ethical dilemmas and promote ethical behavior. Compensatory tactics such as these should be central to organizational ethics initiatives at the leader level.
At the age of sixteen, Einstein imagined chasing after a beam of light. He later recalled that the thought experiment had played a memorable role in his development of special relativity. Famous as it is, it has proven difficult to understand just how the thought experiment delivers its results. It fails to generate problems for an ether-based electrodynamics. I propose that Einstein’s canonical statement of the thought experiment from his 1946 “Autobiographical Notes” makes most sense not as an argument against ether-based electrodynamics, but as an argument against “emission” theories of light.
Higher order cognitive processes, including ethical decision making (EDM), are influenced by the experiencing of discrete emotions. Recent research highlights the negative influence one such emotion, anger, has on EDM and its underlying processes. The mechanism, however, by which anger disrupts EDM has not been investigated. The current study sought to discover whether cognitive appraisals of an emotion-evoking event are the driving mechanisms behind the influence of anger on EDM. One primary (goal obstacle) and one secondary (certainty) appraisal of anger were examined. Study results suggest that appraisals of certainty are the driving mechanism behind the negative relationship between anger and EDM. Certainty appraisals led to less application of EDM-promoting strategies and more unethical social motives. Findings further highlight the value of investigating appraisals of emotional events, given their cognitive nature, for their potential effects on cognitive operations, such as EDM. Future directions and implications are discussed.
Several of the contributions to the Lynch et al. Special Issue make the claim that conversation-analytic research into epistemics is ‘routinely crafted at the expense of actual, produced and constitutive detail, and what that detail may show us’. Here, we seek to address the inappositeness of this critique by tracing precisely how it is that recognizable actions emerge from distinct practices of interaction. We begin by reviewing some of the foundational tenets of conversation-analytic theory and method – including the relationship between position and composition, and the making of collections – as these appear to be primary sources of confusion for many of the contributors to the Lynch et al. Special Issue. We then target some of the specific arguments presented in the Special Issue, including the alleged ‘over-hearer’s’ writing of metrics, the provision of so-called ‘alternative’ analyses and the supposed ‘crafting’ of generalizations in epistemics research. In addition, in light of Lynch’s more general assertion that conversation analysis has recently been experiencing a ‘rapprochement’ with what he disparagingly refers to as the ‘juggernaut’ of linguistics, we discuss the specific expertise that linguists have to offer in analyzing particular sorts of interactional detail. The article as a whole thus illustrates that, rather than being produced ‘at the expense of actual, produced and constitutive detail’, conversation-analytic findings – including its work in epistemics – are unambiguously anchored in such detail. We conclude by offering our comments as to the link between CA and linguistics more generally, arguing that this relationship has long proven to be – and indeed continues to be – a mutually beneficial one.
Throughout much of the 20th Century, the relationship between analytic and continental philosophy has been one of disinterest, caution or hostility. Recent debates in philosophy have highlighted some of the similarities between the two approaches and even envisaged a post-continental and post-analytic philosophy. Opening with a history of key encounters between philosophers of opposing camps since the late 19th Century - from Frege and Husserl to Derrida and Searle - the book goes on to explore in detail the main methodological differences between the two approaches. This covers a very wide range of topics, from issues of style and clarity of exposition to formal methods arising from logic and probability theory. The final section presents a balanced critique of the two schools’ approaches to key issues such as Time, Truth, Subjectivity, Mind and Body, Language and Meaning, and Ethics. Analytic Versus Continental is the first sustained analysis of both approaches to philosophy, examining the limits and possibilities of each. It provides a clear overview of a much-disputed history and, in highlighting the strengths and weaknesses of both traditions, also offers future directions for both continental and analytic philosophy.
We might suppose it is not only instrumentally valuable for beliefs to be true, but that it is intrinsically valuable – truth makes a non-derivative, positive contribution to a belief's overall value. Some intrinsic goods are better than others, though, and this article considers the question of how good truth is, compared to other intrinsic goods. I argue that truth is the worst of all intrinsic goods; every other intrinsic good is better than it. I also suggest the best explanation for truth's inferiority is that it is not really an intrinsic good at all. It is intrinsically neutral.
An epistemic duty would be a duty to believe, disbelieve, or withhold judgment from a proposition, and it would be grounded in purely evidential or epistemic considerations. If I promise to believe it is raining, my duty to believe is not epistemic. If my evidence is so good that, in light of it alone, I ought to believe it is raining, then my duty to believe supposedly is epistemic. I offer a new argument for the claim that there are no epistemic duties. Though people do sometimes have duties to believe, disbelieve, or withhold judgment from propositions, those duties are never grounded in purely epistemic considerations.
In this article, I map current conceptions of cosmopolitanism and sketch distinctions between it and both humanism and multiculturalism. The differences mirror what I take to be a central motif of cosmopolitanism: the capacity to fuse reflective openness to the new with reflective loyalty to the known. This motif invites a reconsideration of the meaning of culture as well as of the relations between home and the world.
At a time when the analytic/continental split dominates contemporary philosophy, this ambitious work offers a careful and clear-minded way to bridge that divide. Combining conceptual rigor and clarity of prose with historical erudition, A Thing of This World shows how one of the standard issues of analytic philosophy—realism and anti-realism—has also been at the heart of continental philosophy. Using a framework derived from prominent analytic thinkers, Lee Braver traces the roots of anti-realism to Kant's idea that the mind actively organizes experience. He then shows in depth and in detail how this idea evolves through the works of Hegel, Nietzsche, Heidegger, Foucault, and Derrida. This narrative presents an illuminating account of the history of continental philosophy by explaining how these thinkers build on each other's attempts to develop new concepts of reality and truth in the wake of the rejection of realism. Braver demonstrates that the analytic and continental traditions have been discussing the same issues, albeit with different vocabularies, interests, and approaches. By developing a commensurate vocabulary, his book promotes a dialogue between the two branches of philosophy in which each can begin to learn from the other.
Two compelling principles, the Reasonable Range Principle and the Preservation of Irrelevant Evidence Principle, are necessary conditions that any response to peer disagreements ought to abide by. The Reasonable Range Principle maintains that a resolution to a peer disagreement should not fall outside the range of views expressed by the peers in their dispute, whereas the Preservation of Irrelevant Evidence Principle maintains that a resolution strategy should be able to preserve unanimous judgments of evidential irrelevance among the peers. No standard Bayesian resolution strategy satisfies the PIE Principle, however, and we give a loss aversion argument in support of PIE and against Bayes. The theory of imprecise probability allows one to satisfy both principles, and we introduce the notion of a set-based credal judgment to frame and address a range of subtle issues that arise in peer disagreements.
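A toy numerical illustration of the PIE worry (my own numbers, not an example from the paper): two peers each judge evidence E irrelevant to hypothesis H, yet straight linear averaging of their credences destroys that unanimous irrelevance judgment, while a set-based (imprecise) resolution trivially preserves it.

```python
# Two peers' credences over the four-cell partition of H and E.
# Each peer treats H and E as independent, so P(H | E) = P(H) for both.
p1 = {"H&E": 0.10, "H&~E": 0.10, "~H&E": 0.40, "~H&~E": 0.40}  # P(H)=0.2, P(E)=0.5
p2 = {"H&E": 0.48, "H&~E": 0.12, "~H&E": 0.32, "~H&~E": 0.08}  # P(H)=0.6, P(E)=0.8

def marg_H(p):
    return p["H&E"] + p["H&~E"]

def cond_H_given_E(p):
    return p["H&E"] / (p["H&E"] + p["~H&E"])

# Both peers judge E irrelevant to H.
assert abs(cond_H_given_E(p1) - marg_H(p1)) < 1e-9
assert abs(cond_H_given_E(p2) - marg_H(p2)) < 1e-9

# Straight averaging (linear pooling) of the two credence functions:
pool = {k: 0.5 * (p1[k] + p2[k]) for k in p1}
print(marg_H(pool), cond_H_given_E(pool))  # the two values now differ

# A set-based credal resolution keeps both distributions as members;
# every member still satisfies P(H | E) = P(H), so irrelevance survives.
credal_set = [p1, p2]
```

Here the pooled credence gives P(H) = 0.4 but P(H | E) ≈ 0.446, so the unanimous irrelevance judgment is lost under pooling even though neither peer held it.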
To arrive at their final evaluation of a manuscript or grant proposal, reviewers must convert a submission’s strengths and weaknesses for heterogeneous peer review criteria into a single metric of quality or merit. I identify this process of commensuration as the locus for a new kind of peer review bias. Commensuration bias illuminates how the systematic prioritization of some peer review criteria over others permits and facilitates problematic patterns of publication and funding in science. Commensuration bias also foregrounds a range of structural strategies for realigning peer review practices and institutions with the aims of science.
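A minimal sketch (with invented scores and criteria) of the commensuration step: the same pair of reviewer evaluations yields opposite overall rankings depending on how the heterogeneous criteria are weighted into a single metric.

```python
# Raw criterion scores for two hypothetical submissions.
scores = {
    "A": {"novelty": 9, "rigor": 5},  # bold but methodologically looser
    "B": {"novelty": 5, "rigor": 9},  # careful but incremental
}

def overall(s, weights):
    """Commensurate heterogeneous criteria into one merit score."""
    return sum(weights[c] * s[c] for c in s)

w_novelty_first = {"novelty": 0.7, "rigor": 0.3}
w_rigor_first   = {"novelty": 0.3, "rigor": 0.7}

print(overall(scores["A"], w_novelty_first), overall(scores["B"], w_novelty_first))
print(overall(scores["A"], w_rigor_first), overall(scores["B"], w_rigor_first))
# The identical raw evaluations produce opposite rankings under the two
# weightings, which is the opening commensuration bias exploits.
```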
This paper argues against the realization principle, which reifies the realization relation between lower-level and higher-level properties. It begins with a review of some principles of naturalistic metaphysics. Then it criticizes some likely reasons for embracing the realization principle, and finally it argues against the principle directly. The most likely reasons for embracing the principle depend on the dubious assumption that special science theories cannot be true unless special science predicates designate properties. The principle itself turns out to be false because the realization relation fails the naturalistic test for reality: it makes no causal difference to the world. This paper resulted from work done at John Heil's 2006 Mind and Metaphysics NEH Summer Seminar at Washington University in St. Louis. An early version of it was presented in a special symposium on realization at the 2007 meeting of the Southern Society for Philosophy and Psychology. I owe thanks to all the participants in both events for helpful discussions, and I owe particular thanks to Ken Aizawa, Torin Alter, Jason Ford, Carl Gillett, John Heil, Nicholas Helms, Pete Mandik, John Post, Gene Witmer, Michelle Wrenn, Tad Zawidzki, and two anonymous referees for the AJP.
Ludwig Wittgenstein and Martin Heidegger are two of the most important--and two of the most difficult--philosophers of the twentieth century, indelibly influencing the course of continental and analytic philosophy, respectively. In _Groundless Grounds_, Lee Braver argues that the views of both thinkers emerge from a fundamental attempt to create a philosophy that has dispensed with everything transcendent so that we may be satisfied with the human. Examining the central topics of their thought in detail, Braver finds that Wittgenstein and Heidegger construct a philosophy based on _original finitude_--finitude without the contrast of the infinite. In Braver's elegant analysis, these two difficult bodies of work offer mutual illumination rather than compounded obscurity. Moreover, bringing the most influential thinkers in continental and analytic philosophy into dialogue with each other may enable broader conversations between these two divergent branches of philosophy. Braver's meticulously researched and strongly argued account shows that both Wittgenstein and Heidegger strive to construct a new conception of reason, free of the illusions of the past and appropriate to the kind of beings that we are. Readers interested in either philosopher, or concerned more generally with the history of twentieth-century philosophy as well as questions of the nature of reason, will find _Groundless Grounds_ of interest.
This paper argues against the almost universally held view that truth is an instrumentally valuable property of beliefs. For truth to be instrumentally valuable in the way usually supposed, it must play a causal role in the satisfaction of our desires. As it happens, truth can play no such role, because it is screened off from causal relevance by some of the truth-like properties first discussed by Stephen Stich. Because it is not causally relevant to the success of our actions, truth is not instrumentally valuable in the way usually supposed.
Earman and Ruetsche () have cast their gaze upon existing no-go theorems for relativistic modal interpretations, and have found them inconclusive. They suggest that it would be more fruitful to investigate modal interpretations proposed for "really relativistic theories," that is, algebraic relativistic quantum field theories. They investigate the proposal of Clifton (), and extend Clifton's result that, for a host of states, his proposal yields no definite observables other than multiples of the identity. This leads Earman and Ruetsche to a suspicion that troubles for modal interpretations of such relativistic theories "are due less to the Poincaré invariance of relativistic QFT vs. the Galilean invariance of ordinary nonrelativistic QM than to the infinite number of degrees of freedom of the former vs. the finite number of degrees of freedom of the latter" (577-78). I am skeptical of this suggestion. Though there are troubles for modal interpretations of a relativistic quantum field theory that are due to its being a field theory—that is, due to the infinitude of the degrees of freedom—they are not the only troubles faced by modal interpretations of quantum theories set in relativistic spacetime; there are also troubles traceable to relativistic causal structure.
In this article, we carefully examine two important implementation issues when estimating propensity scores using generalized boosted models (GBM), a promising machine learning technique. First, we examine which of the following methods for tuning GBM leads to better covariate balance and inferences about causal effects: pursuing covariate balance between the treatment groups or tuning the propensity score model on the basis of a model fit criterion. Second, we examine how well GBM can handle irrelevant covariates that are included in the estimation model. We find that chasing balance rather than model fit when estimating propensity scores yielded better covariate balance and more accurate treatment effect estimates. Additionally, we find that adding irrelevant covariates to GBM increased imbalance and bias in the treatment effects. The findings from this paper have useful implications for other work focused on improving methods for estimating propensity scores.
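A self-contained Python sketch of the 'chasing balance' idea, with simulated data; the tuning grid, weighting scheme, and balance statistic here are my own choices for illustration, not the authors' implementation. The loop selects the number of boosting iterations that minimizes the largest post-weighting standardized mean difference, rather than optimizing a model-fit criterion.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                       # three covariates
t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # treatment depends on X[:, 0]

def max_abs_smd(X, t, w):
    """Largest absolute standardized mean difference across covariates,
    with the comparison group weighted by w (ATT-style weighting)."""
    smds = []
    for j in range(X.shape[1]):
        x = X[:, j]
        m1 = x[t == 1].mean()
        m0 = np.average(x[t == 0], weights=w[t == 0])
        s = x[t == 1].std()
        smds.append(abs(m1 - m0) / (s if s > 0 else 1.0))
    return max(smds)

best = None
for n_trees in (50, 100, 200, 400):               # candidate stopping points
    gbm = GradientBoostingClassifier(n_estimators=n_trees, max_depth=2)
    p = gbm.fit(X, t).predict_proba(X)[:, 1]      # estimated propensity scores
    w = np.where(t == 1, 1.0, p / (1 - p))        # odds weights for ATT
    bal = max_abs_smd(X, t, w)
    if best is None or bal < best[1]:
        best = (n_trees, bal)

print(best)  # chosen tree count and its balance statistic
```

Tuning on a fit criterion would instead pick the iteration count minimizing, say, cross-validated log-loss; the paper's finding is that the balance-chasing criterion above is the better target for causal inference.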
The following four assumptions plausibly describe the ideal rational agent. (1) She knows what her beliefs are. (2) She desires to believe only truths. (3) Whenever she desires that P → Q and knows that P, she desires that Q. (4) She does not both desire that P and desire that ~P, for any P. Although the assumptions are plausible, they have an implausible consequence. They imply that the ideal rational agent does not believe and desire contradictory propositions. She neither desires the world to be any different than she thinks it is, nor thinks it is any different than she desires it to be. The problem of preserving our intuitions about desire, without embracing the implausible conclusion, is what I call “the Wishful Thinking Puzzle.” In this paper, I examine how this puzzle arises, and I argue that it is surprisingly difficult to solve. Even the decision theoretic conception of desire is not immune to the puzzle. One approach, the contrastive conception of desire, does avoid the puzzle without being ad hoc, but it remains too inchoate to win our full confidence. (shrink)
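One way to make the derivation explicit (my reconstruction of the reasoning the abstract gestures at, writing Bel, Des, and K for belief, desire, and knowledge):

```latex
\begin{align*}
&\text{Suppose, for contradiction: } \mathrm{Bel}(p) \text{ and } \mathrm{Des}(\lnot p).\\
&\text{By (1): } \mathrm{K}(\mathrm{Bel}(p)).\\
&\text{By (2): } \mathrm{Des}(\mathrm{Bel}(p) \rightarrow p).\\
&\text{By (3), applied to the two preceding lines: } \mathrm{Des}(p).\\
&\text{But } \mathrm{Des}(p) \text{ and } \mathrm{Des}(\lnot p) \text{ together violate (4).}
\end{align*}
```

So the four assumptions jointly entail that the ideal agent never desires the negation of anything she believes, which is the implausible consequence the puzzle turns on.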
In this paper, I consider an argument of Harvey Siegel's according to which there can be no hypothetical normativity anywhere unless there is categorical normativity in epistemology. The argument fails because it falsely assumes people must be bound by epistemic norms in order to have justified beliefs.
Psychometrically oriented researchers construe low inter-rater reliability measures for expert peer reviewers as damning for the practice of peer review. I argue that this perspective overlooks different forms of normatively appropriate disagreement among reviewers. Of special interest are Kuhnian questions about the extent to which variance in reviewer ratings can be accounted for by normatively appropriate disagreements about how to interpret and apply evaluative criteria within disciplines during times of normal science. Until these empirical-cum-philosophical analyses are done, it will remain unclear to what extent low inter-rater reliability measures represent reasonable disagreement rather than arbitrary differences between reviewers.
While applied epistemology has been neglected for much of the twentieth century, it has seen emerging interest in recent years, with key thinkers in the field helping to put it on the philosophical map. Although it is an old tradition, current technological and social developments have dramatically changed both the questions it faces and the methodology required to answer those questions. Recent developments also make it a particularly important and exciting area for research and teaching in the twenty-first century. The Routledge Handbook of Applied Epistemology is an outstanding reference source to this exciting subject and the first collection of its kind. Comprising entries by a team of international contributors, the Handbook is divided into six main parts: the Internet; politics; science; epistemic institutions; individual investigators; and theory and practice in philosophy. Within these sections, the core topics and debates are presented, analyzed, and set into broader historical and disciplinary contexts. The central topics covered include: the prehistory of applied epistemology, expertise and scientific authority, epistemic aspects of political and social philosophy, epistemology and the law, and epistemology and medicine. Essential reading for students and researchers in epistemology, political philosophy, and applied ethics, the Handbook will also be very useful for those in related fields, such as law, sociology, and politics.
Under the traditional system of peer-reviewed publication, the degree of prestige conferred on authors by successful publication is tied to the intellectual rigor of the journal's peer review process: ambitious scientists do well professionally by doing well epistemically. As a result, we should expect journal editors, in their dual role as epistemic evaluators and prestige-allocators, to have the power to motivate improved author behavior through the tightening of publication requirements. Contrary to this expectation, I will argue that the publication bias literature in academic medicine demonstrates that editor interventions have had limited effectiveness in improving the health of the publication and trial registration record, suggesting that much stronger interventions are needed.
Laurence BonJour has recently proposed a novel and interesting approach to the problem of induction. He grants that it is contingent, and so not a priori, that our patterns of inductive inference are reliable. Nevertheless, he claims, it is necessary and a priori that those patterns are highly likely to be reliable, and that is enough to ground an a priori justification of induction. This paper examines an important defect in BonJour's proposal. Once we make sense of the claim that inductive inference is "necessarily highly likely" to be reliable, we find that it is not knowable a priori after all.
This study examined the role of key causal analysis strategies in forecasting and ethical decision-making. Undergraduate participants took on the role of the key actor in several ethical problems and were asked to identify and analyze the causes, forecast potential outcomes, and make a decision about each problem. Time pressure and analytic mindset were manipulated while participants worked through these problems. The results indicated that forecast quality was associated with decision ethicality, and the identification of the critical causes of the problem was associated with both higher quality forecasts and higher ethicality of decisions. Neither time pressure nor analytic mindset impacted forecasts or ethicality of decisions. Theoretical and practical implications of these findings are discussed.
What is truth? Is there anything that all truths have in common that makes them true rather than false? Is truth independent of human thought, or does it depend in some way on what we believe or what we would be justified in believing? In what sense, if any, is it better for beliefs or statements to be true than to be false? In this engaging and accessible new introduction, Chase Wrenn surveys a variety of theories of the nature of truth and evaluates their philosophical costs and benefits. Paying particular attention to how the theories accommodate realist intuitions and make sense of truth’s value, he discusses a full range of theories from classical correspondence to relatively new deflationary and pluralist accounts. The book provides a clear, non-technical entry point to contemporary debates about truth for non-specialists. Specialists will also find new contributions to those debates, including a new argument for the superiority of deflationism to causal correspondence and pluralist theories. Drawing on a range of traditional and contemporary debates, this book will be of interest to students and scholars alike and anyone interested in the nature and value of truth.
According to a common objection to epistemological naturalism, no empirical, scientific theory of knowledge can be normative in the way epistemological theories need to be. In response, such naturalists as W.V. Quine have claimed naturalized epistemology can be normative by emulating engineering disciplines and addressing the relations of causal efficacy between our cognitive means and ends. This paper evaluates that "engineering reply" and finds it a mixed success. Based on consideration of what it might mean to call a theory "normative," seven versions of the normativity objection to epistemological naturalism are formulated. The engineering reply alone is sufficient to answer only the four least sophisticated versions. To answer the others, naturalists must draw on more resources than their engineering reply alone provides.
Epistemic duties would be duties to believe, disbelieve, or withhold judgement from propositions, and they would be grounded in purely evidential considerations. I offer a new argument for the claim that there are no epistemic duties. Though people may have duties to believe, disbelieve, or withhold judgement from propositions, those duties are never grounded in purely epistemic considerations. Rather, allegedly epistemic duties are a species of moral duty.
Cases have been employed across multiple disciplines, including ethics education, as effective pedagogical tools. However, the benefit of case-based learning in the ethics domain varies across cases, suggesting that not all cases are equal in terms of pedagogical value. Indeed, case content appears to influence the extent to which cases promote learning and transfer. Consistent with this argument, the current study explored the influences of contextual and personal factors embedded in case content on ethical decision-making. Cases were manipulated to include a clear description of the social context and the goals of the characters involved. Results indicated that social context, specifically the description of an autonomy-supportive environment, facilitated execution of sensemaking processes and resulted in greater decision ethicality. Implications for designing optimal cases and case-based training programs are discussed.
C. S. Peirce once defined pragmatism as the opinion that metaphysics is to be largely cleared up by the application of the following maxim for attaining clearness of apprehension: ‘Consider what effects that might conceivably have practical bearings we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.’ (Peirce 1982a: 48) More succinctly, Richard Rorty has described the position in this way.