The use of assistance systems aimed at reducing road fatalities is spreading, especially for car drivers, but less effort has been devoted to developing and testing similar systems for powered two-wheelers. Considering that speeding represents one of the main causal factors in road crashes and that riders are more vulnerable than drivers, in the present study we investigated the effectiveness of an assistance system that signaled speed limit violations during a simulated moped-driving task, in optimal and poor visibility conditions. Participants performed four conditions of simulated riding: one baseline condition without feedback, one Fog condition in which visual feedback was provided to indicate to the participants when a speed limit was exceeded, and two post-feedback conditions, with and without Fog respectively, in which no feedback was delivered. Results showed that participants made fewer speeding violations even when the feedback was no longer provided, after one month, and regardless of the visibility condition. Finally, the feedback proved effective in reducing speed violations in participants with an aggressive riding style, as measured in the baseline session.
MĂDĂLINA DIACONU, Tasten, Riechen, Schmecken. Eine Ästhetik der anästhesierten Sinne, 2005; SILVIA STOLLER, VERONICA VASTERLING, LINDA FISHER, Feministische Phänomenologie und Hermeneutik, 2005; KARL SCHUHMANN, Karl Schuhmann: Selected Papers on Phenomenology. Edited by CEES LEIJENHORST and PIET STEENBAKKERS, 2004; HIROSHI GOTO, Der Begriff der Person in der Phänomenologie Husserls. Ein Interpretationsversuch der Husserlschen Phänomenologie als Ethik im Hinblick auf den Begriff der Habitualität, 2004; GÜNTER FIGAL, Lebensverstricktheit und Abstandsnahme. „Verhalten zu sich“ im Anschluss an Heidegger, Kierkegaard und Hegel, 2001; JACQUES DERRIDA, Le toucher, Jean-Luc Nancy, 2000.
One central tenet of the Modern Evolutionary Synthesis, and the consensus view among biologists until now, is that all genetic mutations occur by “chance” or at “random” with respect to adaptation. However, the discovery of some molecular mechanisms enhancing mutation rate in response to environmental conditions has given rise to discussions among biologists, historians and philosophers of biology about the “chance” vs “directed” character of mutations. In fact, some argue that mutations due to a particular kind of mutator mechanism challenge the Modern Synthesis because they are produced when and where needed by the organisms concerned. This paper provides a defense of the Modern Synthesis’ consensus view about the chance nature of all genetic mutations by reacting to Jablonka and Lamb’s analysis of genetic mutations and the explicit Lamarckian flavor of their arguments. I argue that biologists can continue to talk about chance mutations according to what I call and define as the notion of “evolutionary chance,” which I claim is the Modern Synthesis’ consensus view and a reformulation of Darwin’s most influential idea of “chance” variation. Advances in molecular genetics are therefore significant but not revolutionary with respect to the Modern Synthesis’ paradigm.
Sociopolitical attitudes are often the root cause of conflicts between individuals, groups, and even nations, but little is known about the origin of individual differences in sociopolitical orientation. We test a combination of economic and evolutionary ideas about the degree to which the mating market, sex, age, and income affect sociopolitical orientation. We collected data online through Amazon’s Mechanical Turk from 1108 US participants who were between 18 and 60, fluent in English, and single. While ostensibly testing a new online dating website, participants created an online dating profile and described people they would like to date. We manipulated the participants’ popularity in the mating market and the size of the market and then measured participants’ sociopolitical attitudes. The sociopolitical attitudes were reduced to five dimensions via Principal Components Analysis. Both manipulations affected attitudes toward wealth redistribution but were largely not significant predictors of the other dimensions. Men reported more unrestricted sociosexual attitudes, and more support for benevolent sexism and traditional family values, than women did, and women supported wealth redistribution more than men did. There was no sex difference in accepting nonconforming behaviors. Younger people and people with lower incomes were more liberal than older people and people with higher incomes, respectively, regardless of sex. Overall, effects were largely not interactive, suggesting that individual differences in sociopolitical orientation may reflect strategic self-interest and be more straightforward than previously predicted.
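The dimensionality reduction mentioned in the abstract is standard Principal Components Analysis. As a generic sketch of that technique (not the authors' exact analysis pipeline): given responses of $n$ participants to $p$ attitude items, arranged in a column-centered data matrix $X \in \mathbb{R}^{n \times p}$, PCA diagonalizes the sample covariance matrix

\[
S = \frac{1}{n-1} X^{\top} X = V \Lambda V^{\top},
\qquad \Lambda = \mathrm{diag}(\lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_p),
\]

and the five retained dimensions are the projections of the data onto the eigenvectors with the five largest eigenvalues:

\[
Z = X V_5, \qquad V_5 = [\, v_1 \; v_2 \; \cdots \; v_5 \,].
\]

Each column of $Z$ is then an uncorrelated composite attitude score, and the proportion of variance captured by the retained components is $\sum_{i=1}^{5} \lambda_i / \sum_{i=1}^{p} \lambda_i$.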
Metaphor has been considered as a cognitive process, independent of the verbal versus visual mode, through which an unknown conceptual domain is understood in terms of another known conceptual domain. Metaphor might instead be viewed as a cognitive process, dependent on the mode, which leads to genuinely new knowledge via ignorance. First, I argue that there are two main senses of ignorance at stake when we understand a metaphor: we ignore some existing properties of the known domain in the sense that we disregard or neglect them; we ignore some “non-existing” properties of the known domain in the sense that they are not a piece of information belonging to the known domain, but emerge in metaphor interpretation. Secondly, I consider a metaphor as a reasoning device, guiding the interpreters along a path of inferences to a conclusion, which attributes to the target some properties of the source. In this path, interpreters might discover the ignored existing properties of the known domain and/or recover the “non-existing” properties, inferring or imagining the missing piece of information. Finally, I argue that, especially in visual metaphors, this process is guided by a “sentiment of rationality”, tracking a disruption of existing familiar conceptualisations of objects and/or actions and a recovery of ignored properties.
The aim of this paper is to provide a definition of the notion of complete and immediate formal grounding through the concepts of derivability and complexity. It will be shown that this definition yields a subtle and precise analysis of the concept of grounding in several paradigmatic cases.
This book offers a reconstruction of the debate on non-Euclidean geometry in neo-Kantianism between the second half of the nineteenth century and the first decades of the twentieth century. Kant famously characterized space and time as a priori forms of intuition, which lie at the foundation of mathematical knowledge. The success of his philosophical account of space was due not least to the fact that Euclidean geometry was widely considered to be a model of certainty in his time. However, such later scientific developments as non-Euclidean geometries and Einstein’s general theory of relativity called into question the certainty of Euclidean geometry and posed the problem of reconsidering space as an open question for empirical research. The transformation of the concept of space from a source of knowledge to an object of research can be traced back to a tradition which includes such mathematicians as Carl Friedrich Gauss, Bernhard Riemann, Richard Dedekind, Felix Klein, and Henri Poincaré, and which finds one of its clearest expressions in Hermann von Helmholtz’s epistemological works. Although Helmholtz formulated compelling objections to Kant, the author reconsiders different strategies for a philosophical account of the same transformation from a neo-Kantian perspective, especially Hermann Cohen’s account of the aprioricity of mathematics in terms of applicability and Ernst Cassirer’s reformulation of the a priori of space in terms of a system of hypotheses. This book is ideal for students, scholars and researchers who wish to broaden their knowledge of non-Euclidean geometry or neo-Kantianism.
In Poggiolesi we have introduced a rigorous definition of the notion of complete and immediate formal grounding; in the present paper our aim is to construct a logic for the notion of complete and immediate formal grounding based on that definition. Our logic will take the form of a natural deduction calculus; it will be proved sound and complete, and it will allow us to obtain fine-grained grounding principles.
At Columbia University in 1906, William James gave a highly confrontational speech to the American Philosophical Association (APA). He ignored the technical philosophical questions the audience had gathered to discuss and instead addressed the topic of human energy. Trampling on the rules of academic decorum, James invoked the work of amateurs, read testimonials on the benefits of yoga and alcohol, and concluded by urging his listeners to take up this psychological and physiological problem. What was the goal of this unusual speech? Rather than an oddity, Francesca Bordogna asserts that the APA address was emblematic—it was just one of many gestures that James employed as he plowed through the barriers between academic, popular, and pseudoscience, as well as the newly emergent borders between the study of philosophy, psychology, and the “science of man.” Bordogna reveals that James’s trespassing of boundaries was an essential element of a broader intellectual and social project. By crisscrossing divides, she argues, James imagined a new social configuration of knowledge, a better society, and a new vision of the human self. As the academy moves toward an increasingly interdisciplinary future, William James at the Boundaries reintroduces readers to a seminal influence on the way knowledge is pursued.
This paper is a critical response to Andreas Bartels’ sophisticated defense of a structural account of scientific representation. We show that, contrary to Bartels’ claim, homomorphism fails to account for the phenomenon of misrepresentation. Bartels claims that homomorphism is adequate in two respects. First, it is conceptually adequate, in the sense that it shows how representation differs from misrepresentation and non-representation. Second, if properly weakened, homomorphism is formally adequate to accommodate misrepresentation. We question both claims. First, we show that homomorphism is not the right condition to distinguish representation from misrepresentation and non-representation: a “representational mechanism” actually does all the work, and it is independent of homomorphism – as of any structural condition. Second, we test the claim of formal adequacy against three typical kinds of inaccurate representation in science which, by reference to a discussion of the notorious billiard ball model, we define as abstraction, pretence, and simulation. We first point out that Bartels equivocates between homomorphism and the stronger condition of epimorphism, and that the weakened form of homomorphism that Bartels puts forward is not a morphism at all. After providing a formal setting for abstraction, pretence and simulation, we show that for each morphism there is at least one form of inaccurate representation which is not accommodated. We conclude that Bartels’ theory – while logically laying down the weakest structural requirements – is nonetheless formally inadequate in its own terms. This should cast serious doubt on the plausibility of any structural account of representation more generally.
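For readers unfamiliar with the structural notions at issue, the standard model-theoretic definitions run as follows (this is the textbook usage, not Bartels’ weakened variant, which the paper argues is not a morphism at all). Given structures $\mathcal{A} = \langle A, R_1^{\mathcal{A}}, \dots \rangle$ and $\mathcal{B} = \langle B, R_1^{\mathcal{B}}, \dots \rangle$ of the same signature, a map $h : A \to B$ is a homomorphism iff for every $n$-ary relation $R_i$,

\[
R_i^{\mathcal{A}}(a_1, \dots, a_n) \;\Rightarrow\; R_i^{\mathcal{B}}\bigl(h(a_1), \dots, h(a_n)\bigr),
\]

and an epimorphism is a homomorphism that is additionally surjective onto $B$. The equivocation the paper diagnoses turns precisely on the gap between these two conditions: surjectivity is a strictly stronger demand on the representing structure.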
Unethical and dishonest behavior has increasingly attracted the attention of scholars from various disciplines. Recent work has begun to focus on a previously overlooked factor predicting dishonest behavior: the beneficiary or victim of dishonest acts. In two laboratory experiments, we manipulate the level of resources allocated to our participants (their "wealth") and investigate whether perceived inequity from wealth that is randomly or subjectively assigned leads individuals to cross ethical boundaries through helping or hurting others. The results show that dishonest behavior is influenced by positive and negative inequity that motivates helping and hurting acts. Furthermore, a third experiment shows that people tend to discount the wrongness of crossing ethical boundaries to hurt or help others when the action restores equity.
Our perception of where touch occurs on our skin shapes our interactions with the world. Most accounts of cutaneous localisation emphasise spatial transformations from a skin-based reference frame into body-centred and external egocentric coordinates. We investigated another possible method of tactile localisation based on an intrinsic perception of ‘skin space’. The arrangement of cutaneous receptive fields (RFs) could allow one to track a stimulus as it moves across the skin, similarly to the way animals navigate using path integration. We applied curved tactile motions to the hands of human volunteers. Participants identified the location midway between the start and end points of each motion path. Their bisection judgements were systematically biased towards the integrated motion path, consistent with the characteristic inward error that occurs in navigation by path integration. We thus showed that integration of continuous sensory inputs across several tactile RFs provides an intrinsic mechanism for spatial perception.
Cryonics—also known as cryopreservation or cryosuspension—is the preservation of legally dead individuals at ultra-low temperatures. Those who undergo this procedure hope that future technology will not only succeed in reviving them, but also cure them of the condition that led to their demise. In this sense, some hope that cryopreservation will allow people to continue living indefinitely. This book discusses the moral concerns of cryonics, both as a medical procedure and as an intermediate step toward life extension. In particular, Minerva analyses the moral issues surrounding cryonics-related techniques by focusing on how they might impact the individuals who undergo cryosuspension, as well as society at large.
The contradictions between food poverty affecting a large section of the global population and the everyday wastage of food, particularly in high income countries, have raised significant academic and public attention. All actors in the food chain have a role to play in food waste prevention and reduction, including farmers, food manufacturers and processors, caterers and retailers and ultimately consumers. Food surplus redistribution is considered by many as a partial solution to food waste reduction and food poverty mitigation, while others criticize charitable initiatives as inadequate responses that inhibit governments from responsibly protecting citizens’ right to food. This paper frames food assistance as “hybrid systems”, situated at the intersection of territorial food, public welfare and third sector voluntary systems. Based on available literature and reflections on previous research examining food banks in Italy, we develop a system dynamics conceptual mapping. The aim is to model a set of relations and dynamic mechanisms associated with variables relevant to food waste generation, food recovery for social purposes and food poverty alleviation. The analysis of feedback interactions highlights the vulnerabilities of food assistance systems that occur when addressing food poverty by reducing food surplus. In summary, as awareness of food poverty and food surplus rises, incentives for food recovery and redistribution strengthen the role of food assistance actors, increasing their exposure to drivers of change, such as retailers’ standards for food surplus prevention. This paper contributes to the current academic debate on charitable food assistance, with insights for policy makers and other systems’ actors.
Abortion is largely accepted even for reasons that do not have anything to do with the fetus' health. By showing that (1) both fetuses and newborns do not have the same moral status as actual persons, (2) the fact that both are potential persons is morally irrelevant and (3) adoption is not always in the best interest of actual people, the authors argue that what we call ‘after-birth abortion’ (killing a newborn) should be permissible in all the cases where abortion is, including cases where the newborn is not disabled.
This paper studies the notions of conceptual grounding and conceptual explanation, with the aim of clarifying the links between them. On the one hand, it analyses complex examples of these two notions that bring to the fore features that are easily overlooked otherwise. On the other hand, it provides a formal framework for modeling both conceptual grounding and conceptual explanation, based on the concept of proof. Inspiration and analogies are drawn with recent research in metaphysics on the pair metaphysical grounding–metaphysical explanation, and especially with the literature in philosophy of science on the pair causality–causal explanation.
The aim of this paper is to investigate Nishida Kitarō’s way of philosophizing in the light of the concept of “transition” in order to deepen our understanding of both Nishida’s philosophy and our thinking about and in transitions, using the concept of “boundary” or “border” (Grenze) as a catalyst. For that purpose, we focus on Nishida’s essay “Place” (「場所」), passing through different parts of the text as if through successive gates on a path of transition between one place and the next, until we reach the final place of “absolute nothingness.” Dwelling on this place, we turn our attention to its internal structure and try to depict it along the outlines of a boundary, following the movements taking place in Nishida’s essay. The second part proposes an interpretation of the place of nothingness as an interminable practice of boundary-crossing that doesn’t come to a halt in a final, all-encompassing place, but dynamically situates itself on countless intersecting planes. After a more or less abstract analysis of the concept of “boundary,” we will apply and concretize this approach by using the example of the skin. To this end, we expose five main features of the skin as boundary: permeability, enclosure, excessiveness, interstitiality and reciprocal self-formation.
Why do many autistic people develop outstanding abilities in domains like drawing, music, computation, and reading? What aspects of autism predispose some to talent? This book explores the origin and prevalence of exceptional talent, its basis in the brain, the current theories, and the representation of talent and autism in biography and fiction.
Neofregeanism and structuralism are among the most promising recent approaches to the philosophy of mathematics. Yet both have serious costs. We develop a view, structuralist neologicism, which retains the central advantages of each while avoiding their more serious costs. The key to our approach is using arbitrary reference to explicate how mathematical terms, introduced by abstraction principles, refer. Focusing on numerical terms, this allows us to treat abstraction principles as implicit definitions determining all properties of the numbers, achieving a key neofregean advantage, while preserving the key structuralist advantage: which objects play the number role does not matter.
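The paradigmatic abstraction principle for introducing numerical terms, and the standard starting point for any neofregean treatment of number, is Hume's Principle:

\[
\# F = \# G \;\leftrightarrow\; F \approx G,
\]

where $\# F$ denotes "the number of $F$s" and $F \approx G$ (equinumerosity) says that there is a one-to-one correspondence between the $F$s and the $G$s. Read as an implicit definition, the principle fixes the behaviour of the number operator $\#$ without singling out which particular objects the numbers are, which is exactly the juncture at which the structuralist question of "which objects play the number role" arises.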
In recent years, researchers and practitioners have increasingly paid attention to food waste, which is seen as highly unethical given its negative environmental and societal implications. Waste recovery is dependent on the creation of connections along the supply chain, so that actors with goods at risk of becoming waste can transfer them to those who may be able to use them as inputs or for their own consumption. Such waste recovery is, however, often hampered by what we call ‘circularity holes’, i.e., missing linkages between waste generators and potential receivers. A new type of actor, the digital platform organization, has recently taken on a brokerage function to bridge circularity holes, particularly in the food supply chain. Yet, extant literature has overlooked this novel type of brokerage that exploits digital technology for the transfer and recovery of discarded resources between supply chain actors. Our study investigates this actor, conceptualized as a ‘circularity broker’, and thus unites network research and circular supply chain research. Focusing on the food supply chain, we adopt an interpretive inductive theory-building approach to uncover how platform organizations foster the recovery of waste by bridging circularity holes. We identify and explicate six brokerage roles, i.e., connecting, informing, protecting, mobilizing, integrating and measuring, and discuss them in relation to extant literature, highlighting novelties compared to earlier studies. The final section reflects on contributions, implications, limitations and areas for further research.
The goal of this article is to introduce a philosophical analysis of a widely neglected condition which affects between 3% and 18% of the population. People affected by this condition experience a lower level of wellbeing than the average population and are discriminated against in both their professional and their personal life. I will argue that this form of discrimination should be taken more seriously in philosophical debate and that social, legal and medical measures ought to be taken in order to improve the quality of life of people affected by this condition.
In this paper, we present a simple sequent calculus for the modal propositional logic S5. We prove that this sequent calculus is theoremwise equivalent to the Hilbert-style system S5, that it is contraction-free and cut-free, and finally that it is decidable. All results are proved in a purely syntactic way.
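For reference, the Hilbert-style system S5 against which the sequent calculus is proved theoremwise equivalent is standardly axiomatized as follows (this is the textbook presentation, not necessarily the paper's exact formulation): all classical propositional tautologies, plus the modal axiom schemas

\[
\mathrm{K}:\; \Box(A \to B) \to (\Box A \to \Box B), \qquad
\mathrm{T}:\; \Box A \to A, \qquad
\mathrm{5}:\; \Diamond A \to \Box \Diamond A,
\]

closed under modus ponens (from $A$ and $A \to B$, infer $B$) and necessitation (from $\vdash A$, infer $\vdash \Box A$). Theoremwise equivalence then means that a formula is derivable in the sequent calculus iff it is a theorem of this Hilbert system, while the sequent presentation additionally secures the proof-theoretic properties (contraction-freedom, cut-freedom, decidability) that Hilbert systems lack.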
Hermann von Helmholtz’s geometrical papers have typically been deemed to provide an implicitly group-theoretical analysis of space, as articulated later by Felix Klein, Sophus Lie, and Henri Poincaré. However, there is less agreement as to what properties exactly in such a view would pertain to space, as opposed to abstract mathematical structures, on the one hand, and empirical contents, on the other. According to Moritz Schlick, the puzzle can be resolved only by clearly distinguishing the empirical qualities of spatial perception from those describable in terms of axiomatic geometry. This paper offers a partial defense of the group-theoretical reading of Helmholtz along the lines of Ernst Cassirer in the fourth volume of The Problem of Knowledge of 1940. In order to avoid the problem raised by Schlick, Cassirer relied on a Kantian view of space not so much as an object of geometry, but as a precondition for the possibility of measurement. Although the concept of group does not provide a description of space, the modern way to articulate the concept of space in terms of transformation groups reveals something about the structure and the transformation of spatial concepts in mathematical and natural sciences.
Grammar as a discipline devoted to the study of language was greatly advanced by the Alexandrian philologists, and especially by Aristarchus, as demonstrated by Stephanos Matthaios. In order to edit Homer and other literary authors, whose texts were often written in archaic Greek and presented many linguistic problems, the Alexandrians had to recognize linguistic grammatical categories and declensional patterns. In particular, to determine the correct orthography or accentuation of debated morphological forms they often employed analogy, which is generally defined as the doctrine that grammatical forms must follow strict rules of declension. Modern scholars have often opposed the Alexandrian doctrine of analogy to the Pergamene doctrine of ‘anomaly’, which favoured spoken usage to determine debated forms. Detlev Fehling and David Blank, however, have shown that this strong opposition never really existed and is mostly due to Varro. More correctly, ancient grammarians identified inflectional rules as well as forms derived from spoken usage or otherwise aberrant forms—however, respect for spoken usage in the latter case was not labelled ‘anomaly’, which was never a technical term of ancient grammar. Rather, and especially in the Roman period, grammarians used the term ‘pathology’ to account for and explain irregular forms.
In this paper our aim is twofold: on the one hand, to present in a clear and faithful way two recent contributions to the logic of grounding, namely those of Correia and Fine; on the other hand, to argue that some of the formal principles describing the notion of grounding proposed by these logics need to be changed and improved.