We present a manuscript of Paul Lorenzen that provides a proof of consistency for elementary number theory as an application of the construction of the free countably complete pseudocomplemented semilattice over a preordered set. This manuscript rests in the Oskar-Becker-Nachlass at the Philosophisches Archiv of Universität Konstanz, file OB 5-3b-5. It was probably written between March and May 1944. We also compare this proof to Gentzen's and Novikov's, and provide a translation of the manuscript.
From 1925 to 1928, the Berlin publishing house J. M. Spaeth, under the direction of Hans Rosenkranz, published a series of works by authors who were rather unknown at the time but, in retrospect, significant figures of the interwar period. This contribution examines Rosenkranz as a young publisher and admirer of Stefan Zweig. Drawing on the archival record, it offers a new perspective on the history of the firm and comments on its literary programme: Which important publishing projects were undertaken in that short period? What role did Stefan Zweig play in the realization of certain titles, particularly in the final weeks of the publisher's existence? To what extent can the programme and economic development of J. M. Spaeth be understood as paradigmatic for Jewish publishing houses in the Weimar Republic? To this end, the firm's failure during the "Bücherkrise" (book crisis) at the end of the 1920s is reconstructed from the sources for the first time.
A critical pathway for conceptual innovation in the social sciences is the construction of theoretical ideas based on empirical data. Grounded theory has become a leading approach promising the construction of novel theories. Yet grounded theory-based theoretical innovation has been scarce, in part because of its commitment to let theories emerge inductively rather than imposing analytic frameworks a priori. We note, along with a long philosophical tradition, that induction does not logically lead to novel theoretical insights. Drawing from the theory of inference, meaning, and action of pragmatist philosopher Charles S. Peirce, we argue that abduction, rather than induction, should be the guiding principle of empirically based theory construction. Abduction refers to a creative inferential process aimed at producing new hypotheses and theories based on surprising research evidence. We propose that abductive analysis arises from actors' social and intellectual positions but can be further aided by careful methodological data analysis. We outline how formal methodological steps enrich abductive analysis through the processes of revisiting, defamiliarization, and alternative casing.
How do we get our knowledge of the natural numbers? Various philosophical accounts exist, but there has been comparatively little attention to psychological data on how the learning process actually takes place. I work through the psychological literature on number acquisition with the aim of characterising the acquisition stages in formal terms. In doing so, I argue that we need a combination of current neologicist accounts and accounts such as that of Parsons. In particular, I argue that we learn the initial segment of the natural numbers on the basis of the Fregean definitions, but do not learn the natural number structure as a whole on the basis of Hume's principle. Therefore, we need to account for some of the consistency of our number concepts with the Dedekind-Peano axioms in other terms.
According to Hempel’s influential theory of explanation, explaining why some a is G consists in showing that the truth that a is G follows from a law-like generalization to the effect that all Fs are G together with the initial condition that a is F. While Hempel’s overall account is now widely considered to be deeply flawed, the idea that some generalizations play the explanatory role that the account predicts is still often endorsed by contemporary philosophers of science. This idea, however, conflicts with widely shared views in metaphysics according to which the generalization that all Fs are G is partially explained by the fact that a is G. I discuss two solutions to this conflict that have been proposed recently, argue that they are unsatisfactory, and offer an alternative.
This paper discusses counterexamples to the thesis that the probabilities of conditionals are conditional probabilities. It is argued that the discrepancy is systematic and predictable, and that conditional probabilities are crucially involved in the apparently deviant interpretations. Furthermore, the examples suggest that such conditionals have a less prominent reading on which their probability is in fact the conditional probability, and that the two readings are related by a simple step of abductive inference. Central to the proposal is a distinction between causal and purely stochastic dependence between variables.
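The gap the abstract alludes to, between the probability of a conditional and the corresponding conditional probability, can be illustrated with a standard textbook contrast (this toy joint distribution is invented for demonstration and is not drawn from the paper, which concerns natural-language conditionals rather than the material conditional):

```python
# Toy joint distribution over two binary variables A and B.
p = {("a", "b"): 0.3, ("a", "nb"): 0.1, ("na", "b"): 0.2, ("na", "nb"): 0.4}

p_a = p[("a", "b")] + p[("a", "nb")]   # P(A) = 0.4
cond_prob = p[("a", "b")] / p_a        # conditional probability P(B | A) = 0.75
material = 1 - p[("a", "nb")]          # probability of the material conditional
                                       # P(not-A or B) = 0.9

print(cond_prob, material)  # 0.75 vs 0.9: the two quantities come apart
```

Whenever P(A) < 1, the material-conditional probability exceeds the conditional probability, which is one reason the identification of the two is contested.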
This imaginative and unusual book explores the moral sensibilities and cultural assumptions that were at the heart of political debate in Victorian and early twentieth-century Britain. It focuses on the role of intellectuals as public moralists, and suggests ways in which their more formal political theory rested upon habits of response and evaluation that were deeply embedded in wider social attitudes and aesthetic judgements. Stefan Collini examines the characteristic idioms and strategies of argument employed in periodical and polemical writing, and reconstructs the sense of identity and of relation to an audience exhibited by social critics from John Stuart Mill and Matthew Arnold to J. M. Keynes and F. R. Leavis. Dr Collini begins by situating the leading intellectuals in the social and political world of the Victorian governing classes. He explores fundamental values like `altruism', `character', and `manliness', which are revealed as the animating dynamic of much of the political thought of the period. The book assesses the impact of increasing academic specialization across a range of disciplines, and offers an illuminating analysis of the public voice of legal theorists like Maine and Dicey. Through a detailed study of J.S. Mill's posthumous reputation Dr Collini uncovers the process by which the genealogy of images of national cultural identity is established; and he concludes with a provocative exploration of the nationalist significance of what he calls `the Whig interpretation of English literature'. Public Moralists is a subtle and illuminating study by a leading intellectual historian which will redirect debate about the distinctive development of modern English culture.
An English double-embedded relative clause from which the middle verb is omitted can often be processed more easily than its grammatical counterpart, a phenomenon known as the grammaticality illusion. This effect has been found to be reversed in German, suggesting that the illusion is language specific rather than a consequence of universal working memory constraints. We present results from three self-paced reading experiments which show that Dutch native speakers also do not show the grammaticality illusion in Dutch, whereas both German and Dutch native speakers do show the illusion when reading English sentences. These findings provide evidence against working memory constraints as an explanation for the observed effect in English. We propose an alternative account based on the statistical patterns of the languages involved. In support of this alternative, a single recurrent neural network model that is trained on both Dutch and English sentences is shown to predict the cross-linguistic difference in the grammaticality effect.
The culture of honour hypothesis offers a compelling example of how human psychology differentially adapts to pastoral and horticultural environments. However, there is disagreement over whether this pattern is best explained by a memetic, evolutionary psychological, dual inheritance, or niche construction model. I argue that this disagreement stems from two shortcomings: lack of clarity about the theoretical commitments of these models and inadequate comparative data for testing them. To resolve the first problem, I offer a theoretical framework for deriving competing predictions from each of the four models. In particular, this involves a novel interpretation of the difference between dual inheritance theory and cultural niche construction. I then illustrate a strategy for testing their predictions using data from the Human Relations Area Files. Empirical results suggest that the aggressive psychological phenotype typically associated with honour culture is more common among pastoral societies than among horticultural societies. Theoretical considerations suggest that this pattern is best explained as a case of cultural niche construction.
For Aristotle, a just political community has to find similarity in difference and foster habits of reciprocity. Conventionally, speech and law have been seen to fulfill this role. This article reconstructs Aristotle’s conception of currency as a political institution of reciprocal justice. By placing Aristotle’s treatment of reciprocity in the context of the ancient politics of money, currency emerges not merely as a medium of economic exchange but also potentially as a bond of civic reciprocity, a measure of justice, and an institution of ethical deliberation. Reconstructing this account of currency in analogy to law recovers the hopes Aristotle placed in currency as a necessary institution particular to the polis as a self-governing political community striving for justice. If currency was a foundational institution, it was also always insufficient, likely imperfect, and possibly tragic. Turned into a tool for the accumulation of wealth for its own sake, currency becomes unjust and a serious threat to any political community. Aristotelian currency can fail precisely because it contains an important moment of ethical deliberation. This political significance of currency challenges accounts of the ancient world as bifurcated between oikos and polis and encourages contemporary political theorists to think of money as a constitutional project that can play an important role in improving reciprocity across society.
In the past few decades, a growth in ethical consumerism has led brands to increasingly develop conscientiousness and depict an ethical image at a corporate level. However, most of the research studying business ethics in the field of corporate brand management is either conceptual or has been empirically conducted in relation to goods/products contexts. This is surprising because corporate brands are especially relevant in services contexts, given the distinct nature of services and the key role that employees play in the services sector. Accordingly, this article aims to examine empirically the effects of customer perceived ethicality in the context of corporate services brands. Based on data collected for eight service categories using a panel of 2179 customers, the hypothesized structural model is tested using path analysis. The results show that, in addition to a direct effect, customer perceived ethicality has a positive and indirect effect on customer loyalty, through the mediators of customer affective commitment and customer perceived quality. Further, employee empathy positively influences the impact of customer perceived ethicality on customer affective commitment, and customer loyalty positively impacts customer positive word-of-mouth. The first implication of these results is that corporate brand strategy needs to be aligned with human resources policies and practices if brands want to turn ethical strategies into employee behavior. Second, corporate brands should build more authentic communications grounded in their ethical beliefs and supported by evidence from actual employees.
It is widely recognized that the innate versus acquired distinction is a false dichotomy. Yet many scientists continue to describe certain traits as “innate” and take this to imply that those traits are not acquired, or “unlearned.” This article asks what cognitive role, if any, the concept of innateness should play in the psychological and behavioural sciences. I consider three arguments for eliminating innateness from scientific discourse. First, the classification of a trait as innate is thought to discourage empirical research into its developmental origin. Second, this concept lumps together a number of different biological properties that ought to be treated as distinct. Third, innateness is associated with the outmoded folk biological theory of essentialism. In response to these objections, I consider two attempts to revise the concept of innateness which aim to make it more suitable for scientific explanation and research. One proposal is that innateness can be defined in terms of the biological property of environmental canalization. On this view, a trait is innate to the extent that it is developmentally buffered against a range of different environments. Another proposal is that innateness serves as an explanatory primitive for cognitive science. This view holds that there exists a sharp boundary between psychological and biological explanations and that to identify a trait as innate means that it falls into the latter explanatory domain. This essay ends with some questions for future research.
In this essay Stefan Neubert argues that John Dewey was a philosopher of reconstruction and that the best use we can make of him today is to reconstruct his work in and for our own contexts. Neubert distinguishes three necessary and equally important components of the overall project of reconstructing Deweyan pragmatism: first, to make strong and productive use of the tradition; second, to establish new theoretical links in order to develop new conceptual tools; and third, to reconsider implications of Deweyan pragmatism with a view toward new articulations of human life experience. Neubert then discusses three recent publications in the field of Dewey scholarship—Larry Hickman's Pragmatism as Post‐Postmodernism, Inna Semetsky's Deleuze, Education and Becoming, and David Granger's John Dewey, Robert Pirsig, and the Art of Living—as examples illustrating the importance of each component. During the course of this discussion, Neubert develops some conclusions about the complexity inherent in the comprehensive task of reconstructing Dewey's philosophy today.
Need considerations play an important role in empirically informed theories of distributive justice. We propose a concept of need-based justice that is related to social participation and provide an ethical measurement of need-based justice. The β-ε-index satisfies the need-principle, monotonicity, sensitivity, transfer and several “technical” axioms. A numerical example is given.
I argue that difference-making should be a crucial element for evaluating the quality of evidence for mechanisms, especially with respect to the robustness of mechanisms, and that it should take central stage when it comes to the general role played by mechanisms in establishing causal claims in medicine. The difference-making of mechanisms should provide additional compelling reasons to accept the gist of the Russo-Williamson thesis and include mechanisms in the protocols for Evidence-Based Medicine, as the EBM+ research group has been advocating.
The goal of behavioral economics is to improve the explanatory and predictive power of economics. This can be achieved by using theoretical and methodological resources of psychology. Its fundamental idea is that the relationship between psychology and economics cannot be subsumed under standard philosophical accounts of intertheoretical relations. Philosophical Problems of Behavioral Economics argues that behavioral economics is best understood as an attempt to deidealize economic theory guided by psychological research. Behavioral economics deconstructs the model of decision-making by adding different elements. Based on this understanding, behavioral economics has a number of tasks: first, it has to identify which economic theory needs to be challenged; second, it aims to identify factors which need to be modelled within economic theories of choice and modify the theory accordingly; and finally, it has to create models that explain economic phenomena based on the new theory. This book analyses the different stages of this deconstruction process and shows how the scientific disciplines of economics and psychology are connected by it. This volume develops a new account of intertheoretical relations based on the idea of deidealization and thus contributes to debates within the philosophy of social science. It is suitable for those who are interested in or study economic theory and philosophy, economic psychology and philosophy of social science.
This paper revisits a well-known rebuttal of Peter van Inwagen’s consequence argument. This CS-rebuttal, as I shall call it, focuses on the counterfactual structure of alternative possibilities. It shows that the ability to do otherwise is such that if the agent had exercised it, the distant past and/or the laws of nature would have been different. On the counterfactual scenario, there is, therefore, no need for the agent to exercise an ability to change the past or the laws of nature. I first present van Inwagen’s original version of the consequence argument. After exposing some difficulties with Lewis’ famous version of the CS-rebuttal, I proceed by explaining and defending an older and, in my view, superior version. I subsequently discuss a traditional incompatibilist rejoinder, which insists that the past and the laws of nature are fixed. Although this rejoinder delivers a valid argument against the existence of alternative possibilities, it relies on premises the compatibilist explicitly rejects. The outcome of the debate is therefore properly characterized as a genuine dialectical stalemate between compatibilists and incompatibilists. In the final sections of the paper, I demonstrate that attempts by Fischer, Holliday and Fischer and Pendergraft to move beyond the stalemate in favor of the incompatibilist position all fail. I thereby show that the debate is marred by a misunderstanding of the semantics underlying the backtracking conditionals sometimes associated with the compatibilist position. In view of my arguments, the dialectical stalemate between compatibilists and incompatibilists regarding the counterfactual structure of the ability to do otherwise remains fully intact.
According to an increasingly popular view among philosophers of science, both causal and non-causal explanations can be accounted for by a single theory: the counterfactual theory of explanation. A kind of non-causal explanation that has gained much attention recently but that this theory seems unable to account for are grounding explanations. Reutlinger (2017: 239-256) has argued that, despite these appearances to the contrary, such explanations are covered by his version of the counterfactual theory. His idea is supported by recent work on grounding by Schaffer and Wilson who claim there to be a tight connection between grounding and counterfactual dependence. The present paper evaluates the prospects of the idea. We show that there is only a weak sense in which grounding explanations convey information about counterfactual dependencies, and that this fact cannot plausibly be taken to reveal a distinctive feature that grounding explanations share with other kinds of explanations.
This paper is devoted to Bolzano’s theory of grounding (Abfolge) in his Wissenschaftslehre. Bolzanian grounding is an explanatory consequence relation that is frequently considered an ancestor of the notion of metaphysical grounding. The paper focuses on two principles that concern grounding in the realm of conceptual sciences and relate to traditionally widespread ideas on explanations: the principles, namely, that grounding orders conceptual truths from simple to more complex ones (Simplicity), and that it comes along with a certain theoretical economy among them (Economy). Being spelled out on the basis of Bolzano’s notion of deducibility (Ableitbarkeit), these principles are revealing for the question to what extent grounding can be considered a formal relation.
In this article I confront Jürgen Habermas' deliberative model of democracy with Claude Lefort's analysis of democracy as a regime in which the locus of power remains an empty place. This confrontation reveals several structural similarities between the two authors and explains how the proceduralization of popular sovereignty provides a discourse-theoretical interpretation of the empty place of power. At the same time, Lefort's insistence on the open-ended nature of the democratic struggle also points towards an unresolved tension at the core of Habermas' model between the cognitive nature of deliberation on the one hand and the freedom of moral and political agents on the other. A proper solution of this tension requires a full appreciation of the ineliminable gap between actual and ideal deliberation. Because actual deliberation can never result in an ideal consensus, the actual exercise of democratic power should be understood as an unavoidable interruption of deliberation. Key words: consensus, deliberation, democracy, empty place of power, Jürgen Habermas, Claude Lefort.
Let us by ‘first-order beliefs’ mean beliefs about the world, such as the belief that it will rain tomorrow, and by ‘second-order beliefs’ let us mean beliefs about the reliability of first-order, belief-forming processes. In formal epistemology, coherence has been studied, with much ingenuity and precision, for sets of first-order beliefs. However, to the best of our knowledge, sets including second-order beliefs have not yet received serious attention in that literature. In informal epistemology, by contrast, sets of the latter kind play an important role in some respectable coherence theories of knowledge and justification. In this paper, we extend the formal treatment of coherence to second-order beliefs. Our main conclusion is that while extending the framework to second-order beliefs sheds doubt on the generality of the notorious impossibility results for coherentism, another problem crops up that might be no less damaging to the coherentist project: facts of coherence turn out to be epistemically accessible only to agents who have a good deal of insight into matters external to their own belief states.
This article considers the prospects of inference to the best explanation (IBE) as a method of confirming causal claims vis-à-vis the medical evidence of mechanisms. I show that IBE is actually descriptive of how scientists reason when choosing among hypotheses, that it is amenable to the balance/weight distinction, a pivotal pair of concepts in the philosophy of evidence, and that it can do justice to interesting features of the interplay between mechanistic and population level assessments.
There are several important criticisms against the unificationist model of scientific explanation: Unification is a broad and heterogeneous notion and it is hard to see how a model of explanation based exclusively on unification can distinguish genuine explanatory unification from cases of ordering or classification. Unification alone cannot solve the asymmetry and irrelevance problems. Unification and explanation pull in different directions and should be decoupled, because for good scientific explanation extra ad explanandum information is often required. I present a possible solution to those problems, by focusing on an often overlooked but important element of how theoretic unification is achieved—the conceptual frameworks of theories. The core conceptual assumptions behind theories are decisive for discriminating between explanatory and non-explanatory unification. The conceptual framework is also flexible enough to balance the tension between informativeness and maximum systematization in constructing explanatory inferences. A short case study of orthogenetic and Darwinian explanations in paleontology is presented as an illustration of how my addition to the unificationist model is applicable to a historical debate between rival explanations.
The entropy-reduction hypothesis claims that the cognitive processing difficulty on a word in sentence context is determined by the word's effect on the uncertainty about the sentence. Here, this hypothesis is tested more thoroughly than has been done before, using a recurrent neural network for estimating entropy and self-paced reading for obtaining measures of cognitive processing load. Results show a positive relation between reading time on a word and the reduction in entropy due to processing that word, supporting the entropy-reduction hypothesis. Although this effect is independent of the effect of word surprisal, we find no evidence that these two measures correspond to cognitively distinct processes.
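The entropy-reduction measure at issue can be sketched in a few lines (a minimal toy illustration, not the paper's RNN-based implementation; the candidate-continuation distributions below are invented for demonstration):

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution over
    possible sentence continuations."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def entropy_reduction(dist_before, dist_after):
    """Entropy reduction for a word: the drop in uncertainty about how
    the sentence will continue, floored at zero (an increase in
    uncertainty counts as zero reduction)."""
    return max(0.0, entropy(dist_before) - entropy(dist_after))

# Toy distributions over four candidate completions, before and after
# reading a disambiguating word.
before = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}  # 2 bits of uncertainty
after = {"A": 0.5, "B": 0.5}                           # 1 bit remains

print(entropy_reduction(before, after))  # 1.0 bit of reduction
```

On the hypothesis, a word producing a larger reduction (here, 1 bit) should predict longer reading times than a word that leaves the uncertainty unchanged.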
Psychological distance effects have attracted the attention of behavioral economists in the context of descriptive modeling and behavioral policy. Indeed, psychological distance effects have been shown for an increasing number of domains and applications relevant to economic decision-making. The current paper questions whether these effects are robust enough for economists to apply them to relevant policy questions. We demonstrate systematic replication failures for the distance-from-a-distance effect shown by Maglio et al., and relate them to theoretical arguments suggesting that psychological distance theories are currently too poorly specified to make predictions that are precise enough for economic analyses.
Reality is a complex affair. It comprises a huge variety of different elements. Importantly, though, reality is not a mere aggregate of its elements but rather a structured whole or system whose building blocks are not all on the same level. Instead, they form hierarchical networks ordered by relations of priority. In such networks, derivative aspects of reality obtain in virtue of their grounds, that is, in virtue of more fundamental aspects of reality that are prior to them. This picture of reality as a structured whole is currently enjoying a renaissance in the works of philosophers such as Kit Fine, Jonathan Schaffer, and many others. But far from being a new picture, it has been widely endorsed throughout the...
Regulating the Creative Economy Drastic changes have occurred throughout the past century and the world community is struggling to find the exact concepts to describe, understand and, possibly, govern them. One of the concepts used to describe these changes is the so-called "creative economy". Even though the concept is becoming more frequently used, it lacks a precise definition and its meaning remains elusive. Moreover, the proliferation of related concepts, such as the "experience economy", the "cultural economy", the "knowledge-based economy" and the "creative and cultural industries", further obscures its precise scope and meaning. These concepts are, however, no less elusive, particularly because they are of a dual or oxymoronic character, which variably combines aspects of culture, creativity and intellectual creation on the one hand with those of the economy, business, trade and commerce on the other. In sum, the conceptual uncertainties also translate into major difficulties in finding appropriate regulatory responses in the sphere of law. The aim of the present article is therefore to cast light on the meaning of the concept of the creative economy with a view to paving the way for its better and more efficient regulation in the legal sphere. To this end, the first part offers a comprehensive interpretative analysis of the "creative economy" with a view to establishing its value to the present global governance debate. Based on the evidence that designates the creative economy as an evolving concept requiring a multidisciplinary model for the formulation of an adequate approach in law- and policymaking, the second part discusses some of the creative economy's major implications in the sphere of law. In this regard, several regulatory examples appear to advocate the abandonment of the conventional in favour of a more holistic method of regulation.
The article concludes with some recommendations that are deemed useful for further debate and research in this area, which ultimately may contribute to the formulation of the kind of creative laws that are needed for the successful regulation of the creative economy in the future.
This volume is the second installment in Stefan Jonsson’s epic study of the crowd and the mass in modern Europe, building on his work in A Brief History of the Masses, which focused on monumental artworks produced in 1789, 1889, and 1989.
Linsky and Zalta try to explain how we can refer to mathematical objects by saying that this happens through definite descriptions which may appeal to mathematical theories. I present two issues for their account. First, there is a problem of finding appropriate pre-conditions to reference, which are currently difficult to satisfy. Second, there is a problem of ensuring the stability of the resulting reference. Slight changes in the properties ascribed to a mathematical object can result in a shift of reference, and this leads to various problems, e.g., it makes inferring knowledge much harder than it actually is.
Statements about the behavior of biochemical entities (e.g., about the interaction between two proteins) abound in the literature on molecular biology and are increasingly becoming the targets of information extraction and text mining techniques. We show that an accurate analysis of the semantics of such statements reveals a number of ambiguities that have to be taken into account in the practice of biomedical ontology engineering: Such statements can not only be understood as event reporting statements, but also as ascriptions of dispositions or tendencies that may or may not refer to collectives of interacting molecules or even to collectives of interaction events.
Integrating the study of human diversity into the human evolutionary sciences requires substantial revision of traditional conceptions of a shared human nature. This process may be made more difficult by entrenched, 'folkbiological' modes of thought. Earlier work by the authors suggests that biologically naive subjects hold an implicit theory according to which some traits are expressions of an animal's inner nature while others are imposed by its environment. In this paper, we report further studies that extend and refine our account of this aspect of folkbiology. We examine biologically naive subjects' judgments about whether traits of an animal are 'innate', 'in its DNA' or 'part of its nature'. Subjects do not understand these three descriptions to be equivalent. Both 'innate' and 'in its DNA' have the connotation that the trait is species-typical. This poses an obstacle to the assimilation of the biology of polymorphic and plastic traits by biologically naive audiences. Researchers themselves may not be immune to the continuing pull of folkbiological modes of thought.