This article sets out to examine and resolve the conceptual lag between the family as defined and recognised in law and the multiplicity of queer constellations of ‘intimate citizenship’ in which families are actually done. The focus is on adult unions outside of conjugal coupledom. The family law practices of adults in such unions, and their awareness and expectations, were analysed through 21 interviews and the content analysis of 40 documents, and were projected against the applicable legal mould. The article then proposes to resolve the existing conceptual lag by advancing a conception of the family as a malleable, open-ended assemblage, in lieu of the current rigid status approach in family law, which is both under- and over-inclusive. The proposed conception does justice to the increasing fluidity of family formations, which are not always already domestic, dyadic and sexual.
This moral philosophy text with readings embraces Socrates' observation that ethics is "no small matter, but how we ought to live." How ought we to live? This hard question captures the full range of moral inquiry from traditional moral theory to contemporary moral issues, such as abortion, capital punishment, and war. But there is much more to moral philosophy: How should we be as people? When should we forgive? Are we capable of morality? What about non-western ethics? And most distressing of all, why be moral in the first place? These and other challenging questions show the profundity and inescapable importance of moral philosophy for a life worth living. Life's Hardest Questions combines lively and informative introductory discussions with classic and contemporary writings in moral philosophy.
Grue-Sørensen’s concept of ‘educational teaching’ is traced back to an original influence from Herbart and Kant. Against this background, the article attempts to interpret how a concept of educational teaching can be understood today. The concept is thereby shown to have its roots in a tradition of general education, of which Grue-Sørensen is shown to be a Danish representative. However, in research programs as well as educational programs this tradition has generally been under increasing pressure for approximately the last 30 years. Grue-Sørensen’s possible relevance today is discussed in connection with a potential revitalization of general educational thinking in our current postmodern epoch of higher education.
This open access monograph argues that established democratic norms for freedom of expression should be implemented on the internet. The moderation policies of tech companies such as Facebook, Twitter and Google have resulted in posts being removed on an industrial scale. While this moderation is often encouraged by governments - on the pretext that terrorism, bullying, pornography, “hate speech” and “fake news” will slowly disappear from the internet - it enables tech companies to censor our society. It is the social media companies who define what is blacklisted in their community standards. And given the dominance of social media in our information society, we run the risk of outsourcing the definition of our principles for discussion in the public domain to private companies. Instead of leaving it to social media companies alone to take action, the authors argue that democratic institutions should take an active role in moderating criminal content on the internet. To make this possible, tech companies should be analyzed to determine whether they are approaching monopoly status. Antitrust legislation should be applied to bring such monopolies within democratic governmental oversight. Though at different stages of their careers - Anne Mette is in the startup phase of her research career, while Frederik is one of the most prolific philosophers in Denmark - the authors found each other in their shared concern about free speech on the internet. The book was originally published in Danish as Dit opslag er blevet fjernet - techgiganter & ytringsfrihed. Praise for 'Your Post has been Removed': "From my perspective both as a politician and as private book collector, this is the most important non-fiction book of the 21st Century. It should be disseminated to all European citizens. The learnings of this book and the use we make of them today are crucial for every man, woman and child on earth. 
Now and in the future.” Jens Rohde, member of the European Parliament for the Alliance of Liberals and Democrats for Europe “This timely book compellingly presents an impressive array of information and analysis about the urgent threats the tech giants pose to the robust freedom of speech and access to information that are essential for individual liberty and democratic self-government. It constructively explores potential strategies for restoring individual control over information flows to and about us. Policymakers worldwide should take heed!” Nadine Strossen, Professor, New York Law School. Author, HATE: Why We Should Resist It with Free Speech, Not Censorship.
The paper gives a detailed reconstruction and discussion of Peirce’s doctrine of propositions, so-called Dicisigns, developed in the years around 1900. The special features that distinguish it from the logical mainstream are highlighted: the functional definition, dependent upon neither conscious stances nor human language; the semiotic characterization extending propositions and quasi-propositions to cover prelinguistic and prehuman occurrences of signs; and the relations of Dicisigns to the conception of facts, of diagrammatical reasoning, of icons and indices, of meanings, of objects, and of syntax in Peirce’s logic-as-semiotics.
Artifacts are probably our most obvious everyday encounter with technology. Therefore, a good understanding of the nature of technical artifacts is a relevant part of technological literacy. In this article we draw from the philosophy of technology to develop a conceptualization of technical artifacts that can be used for educational purposes. Furthermore, we report a small exploratory empirical study to see to what extent teachers’ intuitive ideas about artifacts match the way philosophers write about the nature of artifacts. Finally, we suggest a teaching and learning strategy for improving teachers’ concepts of technical artifacts through practical activities.
In a recent paper, Jeanne Peijnenburg and David Atkinson [Studia Logica, 89:333–341] have challenged the foundationalist rejection of infinitism by giving an example of an infinite, yet explicitly solvable regress of probabilistic justification. So far, however, there has been no criterion for the consistency of infinite probabilistic regresses, and in particular, foundationalists might still question the consistency of the solvable regress proposed by Peijnenburg and Atkinson.
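For orientation, the general shape of such a regress can be sketched as follows (a sketch in our own notation, following the published setup rather than quoting the paper):

```latex
% Each proposition $E_n$ is probabilistically supported by $E_{n+1}$
% through the conditional probabilities
%   \alpha_n = P(E_n \mid E_{n+1}), \qquad \beta_n = P(E_n \mid \neg E_{n+1}).
% The law of total probability yields the recursion
\[
  P(E_n) \;=\; \beta_n + (\alpha_n - \beta_n)\, P(E_{n+1}),
\]
% and when the products $\prod_k (\alpha_k - \beta_k)$ vanish in the
% limit, unfolding the recursion determines the target probability as a
% convergent series, with no foundational term required:
\[
  P(E_0) \;=\; \sum_{n=0}^{\infty} \beta_n \prod_{k=0}^{n-1} (\alpha_k - \beta_k).
\]
```

This is the sense in which an infinite regress can be "explicitly solvable": the series fixes P(E_0) from the conditional probabilities alone.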
The article offers a perspective on the current, pressured state of general pedagogy in light of the central transnational trends in educational governance that have prevailed over roughly the last 25–30 years. It does so by taking its point of departure in the so-called antinomies that a number of educational researchers have treated as defining for modern pedagogy. This theme has been put on the agenda by contemporary educational researchers such as Alexander von Oettingen, Michael Uljens, Lars Løvlie, Gert Biesta, Birgit Schaffar and Dietrich Benner. More specifically, Oettingen sees the question of pedagogical professionalism as split into four antinomic paradoxes concerning rationalisation, pluralisation, individualisation and civilisation. These pedagogical paradoxes have roots in, among others, Kant, Rousseau and Herbart. By focusing on such pedagogical antinomies or paradoxes, the article seeks to show how the transnational educational revolution of the last 25–30 years has eroded the general-pedagogical underpinning of four concepts: I. professionalism, II. learning, III. authenticity and IV. Bildung. The general-pedagogical problems characterising these four concepts are thereby brought into focus. This is done in order to show how, in this light, transnational educational governance constitutes a worrying dismantling or collapse of the fundamental features of general pedagogy.
This volume is the first collection of articles dedicated to Wittgenstein's thoughts on colour, focusing in particular on his so-called Remarks on Colour, a piece of writing that has received comparatively little attention from Wittgenstein scholars. The articles discuss why Wittgenstein wrote so intensively about colour during the last years of his life and what significance these remarks have for understanding his philosophical work in general.
We characterise the intermediate logics which admit a cut-free hypersequent calculus of the form HLJ + R, where HLJ is the hypersequent counterpart of the sequent calculus LJ for propositional intuitionistic logic, and R is a set of so-called structural hypersequent rules, i.e., rules not involving any logical connectives. The characterisation of this class of intermediate logics is presented both in terms of the algebraic and the relational semantics for intermediate logics. We discuss various—positive as well as negative—consequences of this characterisation.
Following Lauwers and Van Liedekerke (1995), this paper explores in a model-theoretic framework the relation between Arrovian aggregation rules and ultraproducts, in order to investigate a source of impossibility results for the case of an infinite number of individuals and an aggregation rule based on a free ultrafilter of decisive coalitions.
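To make the ultrafilter mechanism concrete, here is a small toy model (our own illustration, not the paper's formalism): society ranks x above y exactly when the coalition of individuals who do so belongs to a fixed ultrafilter of decisive coalitions. For a finite electorate every ultrafilter is principal, i.e. generated by a single individual, which is precisely why this style of aggregation collapses into dictatorship in the finite case.

```python
from itertools import combinations

def powerset(s):
    """All subsets of a finite set, as frozensets."""
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def principal_ultrafilter(electorate, d):
    """The ultrafilter on a finite electorate generated by individual d:
    exactly the coalitions that contain d."""
    return {A for A in powerset(electorate) if d in A}

def aggregate(profile, ultrafilter, x, y):
    """Society ranks x above y iff the coalition of individuals who do so
    is decisive, i.e. belongs to the ultrafilter."""
    supporters = frozenset(i for i, pref in profile.items() if pref(x, y))
    return supporters in ultrafilter

# With a principal ultrafilter the aggregate simply copies the
# generator's preferences -- a dictatorship of individual 0 here.
electorate = {0, 1, 2}
U = principal_ultrafilter(electorate, d=0)
profile = {
    0: lambda x, y: x < y,   # individual 0 prefers smaller numbers
    1: lambda x, y: x > y,
    2: lambda x, y: x > y,
}
print(aggregate(profile, U, 1, 2))  # True: 0 prefers 1 to 2, and 0 dictates
```

With a free (non-principal) ultrafilter, which exists only over infinite electorates, no single individual generates the decisive coalitions, and the impossibility results the paper investigates take a different form.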
This note employs the recently established consistency theorem for infinite regresses of probabilistic justification (Herzberg in Stud Log 94(3):331–345, 2010) to address some of the better-known objections to epistemological infinitism. In addition, another proof for that consistency theorem is given; the new derivation no longer employs nonstandard analysis, but utilises the Daniell–Kolmogorov theorem.
The problem of how to rationally aggregate probability measures occurs in particular when a group of agents, each holding probabilistic beliefs, needs to rationalise a collective decision on the basis of a single ‘aggregate belief system’, and when an individual whose belief system is compatible with several probability measures wishes to evaluate her options on the basis of a single aggregate prior via classical expected utility theory. We investigate this problem by first recalling some negative results from preference and judgment aggregation theory which show that the aggregate of several probability measures should not be conceived as the probability measure induced by the aggregate of the corresponding expected utility preferences. We describe how McConway’s (1981) theory of probabilistic opinion pooling can be generalised to cover the case of the aggregation of infinite profiles of finitely additive probability measures, too; we prove the existence of aggregation functionals satisfying responsiveness axioms à la McConway plus additional desiderata even for infinite electorates. On the basis of the theory of propositional-attitude aggregation, we argue that this is the most natural aggregation theory for probability measures. Our aggregation functionals for the case of infinite electorates are neither oligarchic nor integral-based and satisfy a weak anonymity condition. The delicate set-theoretic status of integral-based aggregation functionals for infinite electorates is discussed.
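The finite baseline that the paper generalises can be illustrated with a minimal sketch of linear opinion pooling, the classical pooling rule in McConway's setting (the code below is our own toy example over a finite event space; the paper's contribution, infinite profiles of finitely additive measures, is not captured by it):

```python
# Linear opinion pooling: the aggregate probability of each event is a
# fixed weighted average of the individual probabilities.

def linear_pool(measures, weights):
    """Pool finitely many probability mass functions over the same
    finite event space into one aggregate pmf."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    events = measures[0].keys()
    return {e: sum(w * m[e] for w, m in zip(weights, measures))
            for e in events}

p1 = {"rain": 0.8, "dry": 0.2}
p2 = {"rain": 0.4, "dry": 0.6}
pooled = linear_pool([p1, p2], weights=[0.5, 0.5])
# pooled["rain"] is 0.6 up to float rounding, and pooled sums to 1,
# so the aggregate is again a probability measure.
```

A convenient feature of this rule is that pooling commutes with marginalisation, the responsiveness property at the heart of McConway's characterisation.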
The rejection of an infinitesimal solution to the zero-fit problem by A. Elga (2004) does not seem to appreciate the opportunities provided by the use of internal finitely-additive probability measures. Indeed, internal laws of probability can be used to find a satisfactory infinitesimal answer to many zero-fit problems, not only to the one suggested by Elga, but also to the Markov chain (that is, discrete and memory-less) models of reality. Moreover, the generalization of likelihoods that Elga has in mind is not as hopeless as it appears to be in his article. In fact, for many practically important examples, through the use of likelihoods one can succeed in circumventing the zero-fit problem. 1 The Zero-fit Problem on Infinite State Spaces 2 Elga's Critique of the Infinitesimal Approach to the Zero-fit Problem 3 Two Examples for Infinitesimal Solutions to the Zero-fit Problem 4 Mathematical Modelling in Nonstandard Universes? 5 Are Nonstandard Models Unnatural? 6 Likelihoods and Densities A Internal Probability Measures and the Loeb Measure Construction B The (Countable) Coin Tossing Sequence Revisited C Solution to the Zero-fit Problem for a Finite-state Model without Memory D An Additional Note on ‘Integrating over Densities’ E Well-defined Continuous Versions of Density Functions.
This article is an investigation of parallel themes in Heinrich Hertz's philosophy of science and Kant's theory of schemata, symbols and regulative ideas. It is argued that Hertz's "pictures" bear close similarities to Kantian "schemata", that is, they are rules that link concepts to intuitions and provide them with their meaning. Kant's distinction between symbols and schemata is discussed and related to Hertz's three pictures of mechanics. It is argued that Hertz considered his own picture of mechanics (the "hidden mass" picture) as symbolic in a different way than the force and energy pictures. The final part of the article describes how, soon after the publication of Hertz's Principles of Mechanics, Harald Høffding developed a general theory of analogical reasoning, relying on the ideas of Hertz and Kant.
Two different concepts of iconicity compete in Peirce's diagrammatical logic. One is articulated in his general reflections on the role of diagrams in thought, in what could be termed his diagrammatology — the other is articulated in his construction of Existential Graphs as an iconic system for representing logic. One is operational and defines iconicity in terms of which information may be derived from a given diagram or diagram system — the other has stronger demands on iconicity, adding to the operational criterion a demand for as high a degree of similarity as possible and may be termed optimal iconicity. Peirce himself does not clearly distinguish these two iconicity notions, a fact that has caused some confusion. By isolating them, we get a clearer and more refined conceptual apparatus for analyzing iconic signs, from pictures to logic.
This article discusses how to deal with the relations between different cultural perspectives in classrooms, based on a proposal for considering understanding and knowledge as goals of science education, inspired by Dewey’s naturalistic humanism. It thus combines educational and philosophical interests. In educational terms, our concerns relate to how science teachers position themselves in multicultural classrooms. In philosophical terms, we are interested in discussing the relations between belief, understanding, and knowledge in the light of Dewey’s philosophy. We present a synthesis of Dewey’s theory of inquiry through his naturalistic humanism and discuss its implications for the concepts of belief, understanding, and knowledge, as well as for the goals of science teaching. In particular, we highlight problems arising in the context of possible conflicts between scientific and religious claims in the school environment that result from totalitarian positions. We characterize an individual’s position as totalitarian if he or she takes some way of thinking as the only one capable of expressing the truth about all that exists in the world, lacks the open-mindedness to understand different interpretative perspectives, and attempts to impose his or her interpretation of the facts on others, whether by violent means or not. From this stance, any other perspective is taken to be false a priori and, accordingly, as a putative target to be suppressed or adapted to the privileged way of thinking. We argue, instead, for a more fallibilist evaluation of our own beliefs and a more respectful appraisal of the diversity of students’ beliefs by both students and teachers.
We study a logic for deontic necessity and sufficiency, as originally proposed by van Benthem (1979). Building on earlier work in modal logic, we provide a sound and complete axiomatization for it, consider some standard extensions, and study other important properties. After that, we compare this logic to the logic of “obligation as weakest permission” from Anglberger et al. (2015).
Based on the conception of life and semiosis as co-extensive, an attempt is made to classify the cognitive and communicative potentials of species according to the plasticity and articulatory sophistication they exhibit. A clear distinction is drawn between semiosis and perception, where perception is seen as a high-level activity, an integrated product of a multitude of semiotic interactions inside or between bodies. Previous attempts at finding progressive trends in evolution that might justify a scaling of species from primitive to advanced levels have not met with much success, but when evolution is considered in the light of semiosis such a scaling immediately catches the eye. The main purpose of this paper is to suggest a scaling of this progression in semiotic freedom into a series of distinct steps. The eleven steps suggested are: 1) molecular recognition, 2) prokaryote-eukaryote transformation, 3) division of labor in multicellular organisms, 4) from irritability to phenotypic plasticity, 5) sense perception, 6) behavioral choice, 7) active information gathering, 8) collaboration and deception, 9) learning and social intelligence, 10) sentience, 11) consciousness. In light of this, the paper finally discusses the conceptual framework for biosemiotic evolution. The evolution of biosemiotic capabilities does not take the form of an ongoing composition of simple signs into composite wholes. Rather, it takes the shape of the increasing subdivision and control of a primitive, holophrastic perception-action circuit already committed to “proto-propositions” reliably guiding action in even the most primitive species.
On the basis of the Suppes–Sneed structural view of scientific theories, we take a fresh look at the concept of refutability, which was famously proposed by K.R. Popper in 1934 as a criterion for the demarcation of scientific theories from non-scientific ones, e.g., pseudo-scientific and metaphysical theories. By way of an introduction we argue that a clash between Popper and his critics on whether scientific theories are, in fact, refutable can be partly explained by the fact that Popper and his critics ascribed different meanings to the term ‘theory’. Then we narrow our attention to one particular theory, namely quantum mechanics, in order to elucidate the general matters discussed. We prove that quantum mechanics is irrefutable in a rather straightforward sense, but argue that it is refutable in a more sophisticated sense, which incorporates some observations obtained by looking closely at the practice of physics. We shall locate exactly where non-rigorous elements enter the evaluation of a scientific theory – this makes us see clearly how fruitful mathematics is for the philosophy of science.
Background: The preferable position of Deep Brain Stimulation (DBS) electrodes is proposed to be located in the dorsolateral subthalamic nucleus (STN) to improve general motor performance. The optimal DBS electrode localization for the post-operative improvement of balance and gait is unknown. Methods: In this single-center, retrospective analysis, 66 Parkinson’s disease (PD) patients were assessed pre- and post-operatively using the MDS-UPDRS, the freezing of gait (FoG) score, Giladi’s gait and falls questionnaire and the Berg balance scale. The clinical outcome was related to the DBS electrode coordinates in the x, y, z plane as revealed by image-based reconstruction. Binomial generalized linear mixed models with the fixed-effect variables electrode asymmetry, parkinsonian subtype, medication, age class and clinical DBS-induced changes were analyzed. Results: STN-DBS improved all motor, balance and FoG scores in the MED OFF condition; however, there were heterogeneous results in the MED ON condition. The reconstructed DBS electrode coordinates impacted the responsiveness of axial symptoms. FoG and balance responders showed slightly more medially located STN electrode coordinates and less medio-lateral asymmetry of the reconstructed electrode coordinates across hemispheres compared to non-responders. Conclusion: The reconstructed DBS electrode coordinates, particularly electrode asymmetry on the medio-lateral axis, affected the post-operative responsiveness of balance and FoG symptoms in PD patients.
This book investigates the nature of aesthetic experience and aesthetic objects. Written by leading philosophers, psychologists, literary scholars and semioticians, the book addresses two intertwined issues. The first is related to the phenomenology of aesthetic experience: The understanding of how human beings respond to artworks, how we process linguistic or visual information, and what properties in artworks trigger aesthetic experiences. The examination of the properties of aesthetic experience reveals essential aspects of our perceptual, cognitive, and semiotic capacities. The second issue studied in this volume is related to the ontology of the work of art: Written or visual artworks are a specific type of objects, containing particular kinds of representation which elicit a particular kind of experience. The research question explored is: What properties in artful objects trigger this type of experience, and what characterizes representation in written and visual artworks? The volume sets the scene for state-of-the-art inquiries in the intersection between the psychology and ontology of art. The investigations of the relation between the properties of artworks and the characteristics of aesthetic experience increase our insight into what art is. In addition, they shed light on essential properties of human meaning-making in general.
This article establishes the existence of a definable, countably saturated nonstandard enlargement of the superstructure over the reals. This nonstandard universe is obtained as the union of an inductive chain of bounded ultrapowers. The underlying ultrafilter is the one constructed by Kanovei and Shelah [10].
This paper formally explores the common ground between mild versions of epistemological coherentism and infinitism; it proposes—and argues for—a hybrid, coherentist–infinitist account of epistemic justification. First, the epistemological regress argument and its relation to the classical taxonomy regarding epistemic justification—of foundationalism, infinitism and coherentism—is reviewed. We then recall recent results proving that an influential argument against infinite regresses of justification, which alleges their incoherence on account of probabilistic inconsistency, cannot be maintained. Furthermore, we prove that the Principle of Inferential Justification has rather unwelcome consequences—formally resembling the Sorites paradox—as soon as it is iterated and combined with a natural Bayesian perspective on probabilistic inferences. We conclude that strong versions of foundationalism and infinitism should be abandoned. Positively, we provide a rough sketch for a graded formal coherence notion, according to which infinite regresses of epistemic justification will often have more than a minimal degree of coherence.
This paper aims to show that Selim Berker’s widely discussed prime number case is merely an instance of the well-known generality problem for process reliabilism and thus arguably not as interesting a case as one might have thought. Initially, Berker’s case is introduced and interpreted. Then the most recent response to the case from the literature is presented. Finally, it is argued that Berker’s case is nothing but a straightforward consequence of the generality problem, i.e., the problematic aspect of the case for process reliabilism (if any) is already captured by the generality problem.
Summary: The history of the law of nations is generally seen as a synonym for the history of the laws of war. Yet, a strictly bilateral perspective can distort our interpretation of early modern diplomacy. The Peace of Utrecht inaugurated an era of relative stability in the European state system, based on balance-of-power politics and anti-hegemonic legal argumentation. Incidental conflicts ought to be interpreted against this background. Declarations of war issued in 1718, 1719 and 1733 during the Wars of the Quadruple Alliance and the Polish Succession should not be read as doctrinal surrogates for trials between two parties, but as manifestos in a European arena.
Williamson and others have argued that contextualist theories of the semantics of ‘know’ have a special problem of accounting for our practices of ‘consuming’ knowledge attributions and denials made in other contexts. In what follows, I shall understand the objection as the idea that contextualism has a special problem of accounting for how we are able to acquire epistemically useful information from knowledge claims made in other contexts. I respond to the objection by arguing that the defeasibility of knowledge makes it difficult for everyone to acquire epistemically useful information from knowledge claims made in other contexts, and that there is no special problem for contextualism when it comes to acquiring epistemically useful information from knowledge claims made in other contexts.
The defence of The No Alternatives Argument in a recent paper by R. Dawid, S. Hartmann and J. Sprenger rests on the assumption that the number of acceptable alternatives to a scientific hypothesis is independent of the complexity of the scientific problem. This note proves a generalisation of the main theorem by Dawid, Hartmann and Sprenger, where this independence assumption is no longer necessary. Some of the other assumptions are also discussed, and the limitations of the no-alternatives argument are explored.