The main idea we defend in this paper is that the question of what a logic is should be addressed differently when structural properties enter the game. In particular, we want to support the idea that identifying the set of valid inferences is not enough to characterize a logic. In other words, we will argue that two logical theories can identify the same set of validities and yet not be the same logic.
From a scientific standpoint, the world is more prepared than ever to respond to infectious disease outbreaks; paradoxically, globalization and air travel, antimicrobial resistance, the threat of bioterrorism, and newly emerging pathogens driven by ecological, socioeconomic, and environmental factors have increased the risk of global epidemics.1,2,3 Following the 2002–2003 severe acute respiratory syndrome (SARS) outbreak, global efforts were put in place to build emergency response capabilities able to contain infectious disease outbreaks.4,5,6 But the recent H1N1, Ebola, and Zika global epidemics have shown unnecessary delays and insufficient coordination in response efforts.7,8,9,10 In a thoughtful and compelling essay,11 Thana C. de Campos argues that greater clarity in the definition of pandemics would probably result in more timely and effective emergency responses and in better pandemic preparedness. In her view, a central problem is that the definition of pandemics is based solely on disease transmission across several countries, and not on spread and severity together, which conflates two very different situations: emergency and nonemergency disease outbreaks. A greater emphasis on severity, such that pandemics are defined as severe and rapidly spreading infectious disease outbreaks, would make them “true global health emergencies,” allowing for priority resource allocation and effective collective action in emergency response efforts. Sympathetic to the position taken by de Campos, here I highlight some of the challenges in defining severity during an infectious disease outbreak.
What accounts for the apocalyptic angst that is now so clearly present among Americans who do not subscribe to any religious orthodoxy? Why do so many popular television shows, films, and songs nourish themselves on this very angst? And why do so many artists, from Coldplay to Tori Amos to Tom Wolfe, feel compelled to give it expression? It is tempting to say that America’s fears and anxieties are understandable in the light of 9/11, the ongoing War on Terror, nuclear proliferation, and the seemingly limitless capacity of science to continually challenge our conceptions of the universe and ourselves. Perhaps, too, American culture remains so permeated by Protestant Christianity that even avowed skeptics cannot pry themselves from its grip. In _A Consumer’s Guide to the Apocalypse,_ Eduardo Velásquez argues that these answers are too pat. Velásquez’s astonishing thesis is that when we peer into contemporary artists’ creative depictions of our sensibilities, we discover that the antagonisms fueling the current culture wars stem from the same source. Enthusiastic religion and dogmatic science, the flourishing of scientific reason and the fascination with mystical darkness, cultural triumphalists and multicultural ideologues are all sustained by the same thing: a willful commitment to the basic tenets of the Enlightenment. Velásquez makes his point with insightful readings of the music of Coldplay, Tori Amos, and Dave Matthews and of Michael Frayn’s _Copenhagen,_ Chuck Palahniuk’s _Fight Club,_ and Tom Wolfe’s _I Am Charlotte Simmons._ Written with grace and humor, and directed toward the lay reader, _A Consumer’s Guide to the Apocalypse_ is a tour de force of cultural analysis.
The origins and development of the problem of mental causation are outlined. The underlying presuppositions that give rise to the problem are identified. Possible strategies for solving, or dissolving, the problem are examined.
The global economy’s centre of gravity is shifting. Emerging and developing countries have contributed over 50% of global GDP since the onset of the 21st century, which is unprecedented since the Industrial Revolution. This article offers the first analysis of the creeping convergence of the BRIC world (i.e., Brazil, Russia, India and China) with global legal standards in a key area of International Law: the International Tax Regime (ITR). The ITR is a legal technology fundamentally designed by the League of Nations in the 1920s, when the BRICs played no relevant role. This article proposes a theory that aims to illuminate the core driving forces of the ongoing trend towards global convergence in this area of International Law, from both the static and the dynamic dimensions. The theory is grounded in the logic of two-sided platforms.
Nowadays, global inequalities in access to vaccines are a growing problem, and Intellectual Property Rights have played an important role both in causing and in worsening them. Firstly,...
In his article “The Status of Content”, Boghossian defends what has been called “transcendentalism about content”. According to him, the thesis that there is nothing in the world that corresponds to our thoughts “is not merely implausible but incoherent”. In other words, he thinks that the thesis in question is not simply false on empirical grounds but rather self-refuting or pragmatically incoherent. My purpose in this article is to show that Boghossian’s argument for this view is not valid. My main thesis is that there is no contradiction in applying the notion of truth to both semantic and psychological sentences while, at the same time, holding that there are neither semantic nor mental contents.
In this article I analyze the class- and culture-based exclusion produced by the Chilean neoliberal educational reform carried out between 1990 and 2010. This educational reform follows the same neoliberal model applied to the economy of the country. Although some indicators of coverage and public spending in education improved, the performance gap among social groups increased. In addition, at a cultural level, the reform promoted the value of individual productivity, negatively affecting some of the cultural behaviors developed by low-income groups as a consequence of, and as a reaction to, the exclusion they suffer. Namely, by stressing the notions of human capital and quality education, the reform has tended to reinforce these students’ fatalism and to limit the scope of the organizations they form to improve their academic and social opportunities.
One of the most frequently repeated objections to Darwin, still present in the current literature, holds that the theory of natural selection is tautological, analytic or, at least, irrefutable. In dozens of articles, various authors have spelled out the conditions under which natural selection would be refuted, attempting to show that it does not lack empirical content. This paper follows a different strategy. Taking into account that the charge of tautology or irrefutability has also been leveled against the fundamental laws of other theories, I place the discussion within a broader metatheoretical framework. In order to discuss the status of the second principle of classical mechanics, Moulines introduced the concept of a “guiding principle”. Guiding principles are not directly testable; they can be tested only through their specializations. From my point of view, treating natural selection as a guiding principle explains why many have considered it tautological and why this does not imply vacuity.
This book is a collection of secondary essays on America's most important philosophic thinkers—statesmen, judges, writers, educators, and activists—from the colonial period to the present. Each essay is a comprehensive introduction to the thought of a noted American on the fundamental meaning of the American regime.
In this article, we will present a number of technical results concerning Classical Logic, ST and related systems. Our main contribution consists in offering a novel identity criterion for logics in general and, therefore, for Classical Logic. In particular, we will first generalize the ST phenomenon, thereby obtaining a recursively defined hierarchy of strict-tolerant systems. Second, we will prove that the logics in this hierarchy are progressively more classical, although not entirely classical. We will claim that a logic is to be identified with an infinite sequence of consequence relations holding between increasingly complex relata: formulae, inferences, metainferences, and so on. As a result, the present proposal allows us to differentiate Classical Logic not only from ST, but also from other systems that share its valid metainferences. Finally, we show how these results have interesting consequences for some topics in the philosophical logic literature, among them the debate around Logical Pluralism: that discussion is usually carried out employing a rivalry criterion for logics which, in light of the present investigation, will need to be modified, since two logics can fail to be identical even if they share the same valid inferences.
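To fix ideas, here is one standard way of spelling out the strict-tolerant scheme that such hierarchies build on. This is a reconstruction over strong Kleene valuations, offered as a gloss rather than the authors' exact formulation:

```latex
% Strong Kleene valuations v assign each formula a value in {0, 1/2, 1}.
% ST: strictly true premises must yield a tolerantly true conclusion.
\Gamma \vDash_{\mathrm{ST}} \varphi
  \iff \forall v \,\bigl[\, (\forall \gamma \in \Gamma :\ v(\gamma) = 1)
  \ \Rightarrow\ v(\varphi) \geq \tfrac{1}{2} \,\bigr]
% TS: the dual standard, tolerant premises and a strict conclusion.
\Gamma \vDash_{\mathrm{TS}} \varphi
  \iff \forall v \,\bigl[\, (\forall \gamma \in \Gamma :\ v(\gamma) \geq \tfrac{1}{2})
  \ \Rightarrow\ v(\varphi) = 1 \,\bigr]
```

Roughly, each level of the hierarchy then evaluates metainferences of the previous level by pairing standards of this kind, which is what makes the resulting systems progressively more classical.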
In some recent articles, Cobreros, Egré, Ripley, & van Rooij have defended the idea that abandoning transitivity may lead to a solution to the trouble caused by semantic paradoxes. For that purpose, they develop the Strict-Tolerant approach, which leads them to entertain a nontransitive theory of truth, where the structural rule of Cut is not generally valid. However, the fact that Cut fails in general in the target theory of truth does not mean that there are no safe instances of Cut involving semantic notions. In this article we take up the challenge of regaining all the safe instances of Cut, in the language of the theory, by making essential use of a unary recovery operator. To fulfill this goal, we work within the so-called Goodship Project, which suggests that in order to have nontrivial naïve theories it is sufficient to formulate the corresponding self-referential sentences with suitable biconditionals. A secondary aim of this article is to propose a novel way to carry this project out, showing that the biconditionals in question can be totally classical. In the context of this article, these biconditionals are essentially used in expressing the self-referential sentences; thus, as a collateral result of our work, we prove that none of the recoveries expected of the target theory can be nontrivially achieved if self-reference is expressed through identities.
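As a purely schematic illustration of what a unary recovery operator can do, the rule below is a hypothetical rendering in sequent form, not the article's own calculus: marking the cut formula as safe with \(\circ\) licenses the otherwise unavailable step.

```latex
% Cut is not generally valid, but becomes available when the cut
% formula A is marked as safe by the recovery operator (schematic):
\frac{\Gamma \Rightarrow A \qquad A, \Delta \Rightarrow B}
     {{\circ}A, \Gamma, \Delta \Rightarrow B}
\ (\text{recovered Cut})
```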
This paper analyzes the theory of area developed by Euclid in the Elements and its modern reinterpretation in Hilbert’s influential monograph Foundations of Geometry. Particular attention is paid to the role that two specific principles play in these theories, namely the famous Common Notion 5 and the geometrical proposition known as De Zolt’s postulate. On the one hand, we argue that an adequate elucidation of how these two principles are conceptually related in the theories of Euclid and Hilbert is highly relevant for a better understanding of the respective geometrical practices. On the other hand, we claim that these conceptual relations reveal interesting connections between the two main contemporary approaches to the study of area of plane rectilinear figures, i.e., the geometrical approach consisting in the geometrical theory of equivalence and the metrical approach based on the notion of measure of area. Finally, in an appendix, the logical relations among equivalence, comparison and addition of magnitudes are examined schematically in an abstract setting.
In this paper, we present a non-trivial and expressively complete paraconsistent naïve theory of truth, as a step on the route towards semantic closure. We achieve this goal by expressing self-reference with a weak procedure, which uses equivalences between expressions of the language, as opposed to a strong procedure, which uses identities. Finally, we make some remarks regarding the sense in which the theory of truth discussed has a property closely related to functional completeness, and we present a sound and complete three-sided sequent calculus for this expressively rich theory.
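The contrast between the two procedures can be stated in one line; the notation below is a standard rendering (with \(Tr\) the truth predicate and corner quotes for names), offered as a gloss rather than the paper's own formalism:

```latex
% Strong self-reference: the liar sentence is literally identical to
% the negated truth ascription; weak self-reference only demands equivalence.
\lambda = \neg Tr(\ulcorner \lambda \urcorner)
\qquad \text{vs.} \qquad
\lambda \leftrightarrow \neg Tr(\ulcorner \lambda \urcorner)
```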
This volume has 41 chapters written to honor the 100th birthday of Mario Bunge. It celebrates the work of this influential Argentine/Canadian physicist and philosopher. Contributions show the value of Bunge’s science-informed philosophy and his systematic approach to philosophical problems. The chapters explore the exceptionally wide spectrum of Bunge’s contributions to metaphysics, methodology and philosophy of science, philosophy of mathematics, philosophy of physics, philosophy of psychology, philosophy of social science, philosophy of biology, philosophy of technology, moral philosophy, social and political philosophy, medical philosophy, and education. The contributors include scholars from 16 countries. Bunge combines ontological realism with epistemological fallibilism. He believes that science provides the best and most warranted knowledge of the natural and social world, and that such knowledge is the only sound basis for moral decision making and social and political reform. Bunge argues for the unity of knowledge. In his eyes, science and philosophy constitute a fruitful and necessary partnership. Readers will discover the wisdom of this approach and will gain insight into the utility of cross-disciplinary scholarship. This anthology will appeal to researchers, students, and teachers in philosophy of science, social science, and liberal education programmes. Contents: 1. Introduction; Section I. An Academic Vocation; Section II. Philosophy; Section III. Physics and Philosophy of Physics; Section IV. Cognitive Science and Philosophy of Mind; Section V. Sociology and Social Theory; Section VI. Ethics and Political Philosophy; Section VII. Biology and Philosophy of Biology; Section VIII. Mathematics; Section IX. Education; Section X. Varia; Section XI. Bibliography.
A theory of magnitudes involves criteria for their equivalence, comparison and addition. In this article we examine these aspects from an abstract viewpoint, by focusing on the so-called De Zolt’s postulate in the theory of equivalence of plane polygons. We formulate an abstract version of this postulate and derive it from some selected principles for magnitudes. We also formulate and derive an abstract version of Euclid’s Common Notion 5, and analyze its logical relation to the former proposition. These results prove to be relevant for the clarification of some key conceptual aspects of Hilbert’s proof of De Zolt’s postulate in his classical Foundations of Geometry. Furthermore, our abstract treatment of this central proposition provides interesting insights for the development of a well-behaved theory of compatible magnitudes.
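For orientation, here is one natural way to state the two principles abstractly, writing \(\sqsubset\) for the proper-part relation, \(\approx\) for equivalence, and \(<\) for comparison; this is a hedged rendering, not necessarily the axioms adopted in the article:

```latex
% Common Notion 5: the whole is greater than the part.
x \sqsubset y \ \Rightarrow\ x < y
% De Zolt's postulate: no magnitude is equivalent to a proper part of itself.
x \sqsubset y \ \Rightarrow\ x \not\approx y
```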
In different papers, Carnielli, W. & Rodrigues, A.; Carnielli, W., Coniglio, M. & Rodrigues, A.; and Rodrigues & Carnielli present two logics motivated by the idea of capturing contradictions as conflicting evidence. The first logic is called BLE and the second, a conservative extension of BLE, is named LETJ. Roughly, BLE and LETJ are two non-classical logics in which the Laws of Explosion and Excluded Middle are not admissible. LETJ is built on top of BLE. Moreover, LETJ is a Logic of Formal Inconsistency. This means that there is an operator that, roughly speaking, identifies a formula as having classical behavior. Both systems are motivated by the idea that there are different conditions for accepting or rejecting a sentence of our natural language, so the theory contains special introduction and elimination rules that capture these different conditions of use. Rodrigues & Carnielli’s paper puts forward an interesting and challenging idea: according to them, BLE and LETJ are incompatible with dialetheia. It seems to show that these paraconsistent logics cannot be interpreted using truth-conditions that allow true contradictions. In short, BLE and LETJ talk about conflicting evidence while avoiding talk of gluts. I am going to argue against this point of view. First, I will offer a new interpretation of BLE and LETJ that is compatible with dialetheia. The background of my position is the rejection of the one-canonical-interpretation thesis: the idea according to which a logical system has one standard interpretation. Second, I will show that there is no logical basis for fixing Rodrigues & Carnielli’s interpretation as the canonical way to establish the content of the logical notions of BLE and LETJ. Furthermore, the system LETJ can be captured inside classical logic, and I will also use this technical result to raise further doubts about the one-canonical-interpretation thesis.
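The Logic-of-Formal-Inconsistency feature mentioned here is standardly captured by a "gentle explosion" principle; the schema below is the generic LFI pattern, not a quotation of LETJ's own axiomatics:

```latex
% Explosion fails in general, ...
A, \neg A \nvdash B
% ... but holds for formulas whose classical behavior is marked by the operator:
{\circ}A, A, \neg A \vdash B
```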
I propose a deductive-nomological model for mathematical scientific explanation. In this regard, I modify Hempel’s deductive-nomological model and test it against some recent paradigmatic examples of the mathematical explanation of empirical facts: the seven bridges of Königsberg, the North American synchronized cicadas, and Hénon-Heiles Hamiltonian systems. I argue that mathematical scientific explanations that invoke laws of nature are qualitative explanations, while ordinary scientific explanations that employ mathematics are quantitative explanations. I also analyse the repercussions of this deductive-nomological model for causal explanations.
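The Königsberg case turns on a simple parity fact: a connected multigraph admits a walk crossing every edge exactly once if and only if it has zero or two odd-degree vertices. A minimal sketch of that check (the vertex labels A–D stand for the four land masses, a conventional labeling rather than anything from the paper):

```python
from collections import Counter

def odd_degree_vertices(edges):
    """Count how often each vertex occurs as an endpoint of an edge;
    vertices with an odd count are the odd-degree vertices."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    return [node for node, d in degree.items() if d % 2 == 1]

# The seven bridges of Koenigsberg, as a multigraph on the land masses A-D.
# The graph is connected, so only the parity condition matters here.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

odd = odd_degree_vertices(bridges)
# Euler's theorem: a walk crossing every bridge exactly once exists
# iff there are 0 or 2 odd-degree vertices. Here all four are odd.
print(odd, "->", "walk exists" if len(odd) in (0, 2) else "no such walk")
```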
It is widely accepted that classical logic is trivialized in the presence of a transparent truth-predicate. In this paper, we explain why this point of view must be given up. The hierarchy of metainferential logics defined in Barrio et al. and Pailos recovers classical logic, either in the sense that every classical inferential validity is valid at some point in the hierarchy, or because a logic of a transfinite level defined in terms of the hierarchy shares its validities with classical logic. Each of these logics is consistent with transparent truth, as is shown in Pailos, and this suggests that, contrary to standard opinions, transparent truth is after all consistent with classical logic. However, Scambler presents a major challenge to this approach. He argues that this hierarchy cannot be identified with classical logic in any way, because it recovers no classical antivalidities. We embrace Scambler’s challenge and develop a new logic based on these hierarchies. This logic recovers both every classical validity and every classical antivalidity. Moreover, we follow the same strategy to show that contingencies also need to be taken into account, and that none of the logics presented so far is enough to capture classical contingencies. We then develop a multi-standard approach and elaborate a new logic that captures not only every classical validity, but also every classical antivalidity and contingency. As a truth-predicate can be added to this logic, this result can be interpreted as showing that, despite extremely widely accepted claims to the contrary, classical logic does not trivialize in the context of transparent truth.
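The validity standards underlying these hierarchies are easy to make computational. The sketch below checks ST- and TS-validity by brute force over strong Kleene valuations; the formula encoding and function names are my own choices, made for illustration only.

```python
from fractions import Fraction
from itertools import product

HALF = Fraction(1, 2)

def value(formula, v):
    """Evaluate a strong-Kleene formula over valuation v.
    Atoms are strings; compound formulas are tuples:
    ('not', f), ('and', f, g), ('or', f, g)."""
    if isinstance(formula, str):
        return v[formula]
    op = formula[0]
    if op == "not":
        return 1 - value(formula[1], v)
    if op == "and":
        return min(value(formula[1], v), value(formula[2], v))
    if op == "or":
        return max(value(formula[1], v), value(formula[2], v))
    raise ValueError(f"unknown connective: {op}")

def valid(premises, conclusion, atoms, strict_premises):
    """ST-validity (strict_premises=True): premises must take value 1,
    conclusion must reach at least 1/2.  TS-validity (False): premises
    at least 1/2, conclusion exactly 1."""
    premise_bar = Fraction(1) if strict_premises else HALF
    conclusion_bar = HALF if strict_premises else Fraction(1)
    for vals in product([Fraction(0), HALF, Fraction(1)], repeat=len(atoms)):
        v = dict(zip(atoms, vals))
        if all(value(p, v) >= premise_bar for p in premises) \
                and value(conclusion, v) < conclusion_bar:
            return False  # counterexample valuation found
    return True

# Explosion (A, not-A, therefore B) is ST-valid but TS-invalid.
premises, conclusion = ["A", ("not", "A")], "B"
print(valid(premises, conclusion, ["A", "B"], strict_premises=True))   # True
print(valid(premises, conclusion, ["A", "B"], strict_premises=False))  # False
```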
In “A new proof of the completeness of the Łukasiewicz axioms”, Chang proved that any totally ordered MV-algebra \(A\) is isomorphic to the unit segment \(\Gamma(A^*, u) = \{x \in A^* : 0 \le x \le u\}\) of a totally ordered ℓ-group with strong unit \(A^*\). This was done by the simple intuitive idea of putting denumerably many copies of \(A\) on top of each other. Moreover, he also showed that any such group \(G\) can be recovered from its unit segment, since \(G \cong \Gamma(G, u)^*\), establishing an equivalence of categories. In “Interpretation of AF C*-algebras in Łukasiewicz sentential calculus”, Mundici extended this result to arbitrary MV-algebras and ℓ-groups with strong unit. He takes the representation of \(A\) as a subdirect product of chains \(A_i\), observes that \(A \subseteq \prod_i A_i\), and lets \(A^*\) be the ℓ-subgroup generated by \(A\) inside \(\prod_i A_i^*\). He proves that this idea works, and establishes an equivalence of categories in a rather elaborate way by means of his concept of good sequences and its complicated arithmetic. In this note, essentially self-contained except for Chang’s result, we give a simple proof of this equivalence by taking direct advantage of the arithmetic of the product ℓ-group \(\prod_i A_i^*\), avoiding entirely the notion of good sequence.
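For readers outside the area, the functor in question sends an ℓ-group with strong unit to the MV-algebra structure on its unit segment; the following is the standard definition, which the abstract presupposes:

```latex
% Chang's Gamma functor: the MV-algebra on the unit segment of an
% l-group G with strong unit u.
\Gamma(G, u) = \{\, x \in G : 0 \le x \le u \,\},
\qquad
x \oplus y = (x + y) \wedge u,
\qquad
\neg x = u - x
```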
In this paper we develop a general representation theory for MV-algebras. We furnish the appropriate categorical background to study this problem. Our guiding line is the theory of classifying topoi of coherent extensions of universal algebra theories. Our main result corresponds, in the case of MV-algebras and MV-chains, to the representation of commutative rings with unit as rings of global sections of sheaves of local rings. We prove that any MV-algebra is isomorphic to the MV-algebra of all global sections of a sheaf of MV-chains on a compact topological space. This result is intimately related to McNaughton’s theorem, and we explain why our representation theorem can be viewed as a vast generalization of it. In spite of the language used in this abstract, we have written this paper in the hope that it can be read by experts in MV-algebras but not in sheaf theory, and conversely.
This article presents a comparison of Gotthold E. Lessing’s theory of signs, as expounded in Laocoön: An Essay on the Limits of Painting and Poetry, with Theodor W. Adorno’s two essays on the relationship between music and painting. Our aim is to demonstrate the decisive presence of elements of German classical aesthetics in Adorno’s post-war thought; in particular, we examine how Lessing’s rationalist theory operates within Adorno’s dialectical treatment of the formal irreducibility of artistic media and of their possibilities of convergence, in the context of the avant-garde of the 1960s. In the light of this comparison, we then discuss the themes of Adorno’s 1966 lecture, Art and the Arts, which to some extent consolidates the discussion of the earlier essays on music and painting and concerns the process of media convergence that intensified during that decade. In this context, we note the continuity of Adorno’s theoretical position and set out the differences between the process of pseudomorphosis and that of the imbrication (overlapping) of artistic media, as the philosopher understands them.