Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, makes it possible to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum entropy. In situations of information loss, or whenever the agent relies on a qualitative description of her information state, such as a plausibility ranking over hypotheses or a belief set, the dynamics of AGM belief revision are compatible with Bayesian conditioning; indeed, through the maximum entropy principle, conditioning naturally generates AGM revision operators. This mitigates an impossibility theorem of Lin and Kelly for tracking Bayesian conditioning with AGM revision, and suggests an approach to the compatibility problem that highlights the information loss incurred by acceptance rules in passing from probabilistic to qualitative representations of beliefs.
We investigate the modal logic of stepwise removal of objects, both for its intrinsic interest as a logic of quantification without replacement, and as a pilot study to better understand the complexity jumps between dynamic epistemic logics of model transformations and logics of freely chosen graph changes that get registered in a growing memory. After introducing this logic (MLSR) and its corresponding removal modality, we analyze its expressive power and prove a bisimulation characterization theorem. We then provide a complete Hilbert-style axiomatization for the logic of stepwise removal in a hybrid language enriched with nominals and public announcement operators. Next, we show that model-checking for MLSR is PSPACE-complete, while its satisfiability problem is undecidable. Lastly, we consider an issue of fine structure: the expressive power gained by adding the stepwise removal modality to fragments of first-order logic.
Sixteen years ago, Prahalad and Hart introduced the possibility of both profitably serving the poor and alleviating poverty. This first iteration of the Bottom/Base of the Pyramid (BoP) approach focused on selling to the poor. In 2008, after ethical criticisms were leveled at it, the field moved to BoP 2.0, emphasizing business co-venturing instead. Since 2015, we have witnessed calls for a new iteration, with the focus broadening to a more sustainable development approach to poverty alleviation. In this paper, we seek to answer the question: How has the BoP approach evolved over the past 16 years, and has it delivered on its early promise? We conducted a systematic review of 276 papers published in journals in this period, utilizing a rigorous correspondence analysis method to map key trends, and then further examined the 22 empirical studies conducted on the BoP approach. Our results suggest that the field has evolved, passing through a number of trends and coming full circle, with our analysis pointing to more recent BoP literature emphasizing themes similar to those espoused in the initial BoP iteration, rather than reflecting the principles espoused in either BoP 2.0 or BoP 3.0. Our analysis also points to a lack of clear evidence that the BoP concept has delivered on its promise either to businesses or to BoP participants.
Porter and Kramer (Harv Bus Rev 84:78–92, 2006; Harv Bus Rev 89:62–77, 2011) introduced ‘shared value’ as a ‘new conception of capitalism,’ claiming it is a powerful driver of economic growth and of reconciliation between business and society. The idea has generated strong interest in business and academia; however, its theoretical precepts have not been rigorously assessed. In this paper, we provide a systematic and thorough analysis of shared value, focusing on its ontological and epistemological properties. Our review highlights that ‘shared value’ has spread into the language of multiple disciplines, but that its current conceptualization is vague and presents important discrepancies in the way it is defined and operationalized, such that it is more of a buzzword than a substantive concept. It also overlaps with many other concepts and lacks empirical grounding. We offer recommendations for defining and measuring the concept, take a step toward disentangling it from related concepts, and identify relevant theories and research methods that would facilitate extending the knowledge frontier on shared value.
The proposal that probabilistic inference and unconscious hypothesis testing are central to information processing in the brain has been steadily gaining ground in cognitive neuroscience and associated fields. One popular version of this proposal is the new theoretical framework of predictive processing or prediction error minimization (PEM), which couples unconscious hypothesis testing with the idea of ‘active inference’ and claims to offer a unified account of perception and action. Here we will consider one outstanding issue that still looms large at the core of the PEM framework: the lack of a clear criterion for distinguishing conscious states from unconscious ones. In order to fulfill the promise of becoming a unifying framework for describing and modeling cognition, PEM needs to be able to differentiate between conscious and unconscious mental states or processes. We will argue that one currently popular view, that the contents of conscious experience are determined by the ‘winning hypothesis’, falls short of fully accounting for conscious experience. It ignores the possibility that some states of a system can control that system’s behavior even though they are apparently not conscious. It follows that the ‘winning hypothesis’ view does not provide a complete account of the difference between conscious and unconscious states in the probabilistic brain. We show how this problem for the received view can be resolved by augmenting PEM with Daniel Dennett’s multiple drafts model of consciousness. This move is warranted by the similar roles that attention and internal competition play in both the PEM framework and the multiple drafts model.
This article discusses the places and practices of young heterosexual Malaysian Muslims dating in non-private urban spaces. It is based on research conducted in Kuala Lumpur (KL) in two consecutive summers, 2016 and 2017. Malaysian law (the Khalwat law) does not allow two unrelated people of opposite sexes (where at least one of them is Muslim) to be within ‘suspicious proximity’ of one another in public. This law significantly influences behaviors and activities in urban spaces in KL. In addition to the legal framework, the beliefs of Malaysian Muslims significantly influence the way they perceive space and how they behave in the city. The article discusses the empirical theme, beginning with the participants’ narratives of their engagement with the dominant sexual and gender order in non-private spaces of KL. Utilizing questionnaires, interviews and observations, this paper draws upon a qualitative research project and questions the analytical usefulness of the notion of public space (as a Western construct) in the context of an Islamic, postcolonial, tropical, global city.
This paper reflects on how civic culture is shaped in consumer society. Contemporary society is often described as a ‘consumer society’: one in which identity and status are acquired, and social inclusion or integration is considered to be achieved, through participation in consumer activity.
Hobson & Friston (2014) outline a synthesis of Hobson's work on dreaming and consciousness with Friston’s work on the free energy principle and predictive coding. Whilst we are sympathetic to their claims about the function of dreaming and its relationship to consciousness, we argue that their endorsement of the Cartesian theatre metaphor is neither necessary nor desirable. Furthermore, if it were necessary, this endorsement would undermine their positive claims, as the Cartesian theatre metaphor is widely regarded as unsustainable. We demonstrate this point and then develop an alternative formulation of their position that does not require the Cartesian theatre metaphor. Our positive goal is to clarify Hobson & Friston’s confusing usage of philosophical terminology, replacing it where possible with the more transparent language of the forward models framework. This will require some modifications to their account, which as it stands is philosophically and empirically unsustainable.
Most ethical discussions about diet focus on the justification of specific kinds of products rather than an individual assessment of the moral footprint of eating products of particular animal species. This way of thinking is reflected in the typical division into four dietary attitudes: vegans, vegetarians, welfarists and ordinary meat-eaters. However, the common “all or nothing” discussions between meat-eaters, vegans and vegetarians bypass very important factors in assessing dietary habits. I argue that if we want to discover a properly assessed moral footprint of animal products, we should take into consideration not only the quality of animals' lives during farming or the violation of their rights, as is typically done, but, most of all, their body weight, their lifetime on farms, and the time efficiency of animal product acquisition. Without these factors, an assessment of animal products is much too simplified. If we assume some easily accepted premises, we can justify the thesis that, regardless of the treatment of animals during farming and slaughtering, eating chicken, for example, can be 163 times morally worse than eating beef, drinking milk can be 58 times morally better than eating eggs, and eating some types of fish can be even 501 times worse than eating beef. In order to justify such a thesis there is no need to reform common morality by, for example, criticizing its speciesism. The thesis that some animal products are much worse than others can be justified on common moral grounds.
My paper is a reaction to the polemic by Tomasz Sieczkowski, "Discrimination nonetheless. A reply to Krzysztof Saja" [ICF "Diametros" (36) 2013], which he wrote against my paper "Discrimination against same-sex couples" [ICF "Diametros" (34) 2012]. The purpose of the paper is to refute Sieczkowski’s objections, which rely on a mistaken interpretation of the structure of my main argument. I will describe the proper course of the reasoning expressed in the first article and undermine Sieczkowski’s proposal to justify gay marriages by reference to values such as dignity, freedom and equality.
The aim of this paper is to show that the interpretivist account of propositional attitudes fails even on the most plausible reading, which treats this theory as a version of the deflationary approach to existence coupled with a metaphysical claim about the judgement-dependence of propositional attitudes. It will be argued that adopting a deflationary reading of interpretivism allows this theory to avoid the common charge of fictionalism, according to which interpretivists cannot maintain realism about attitudes as their theory becomes a covert form of mental fictionalism. However, as will be shown, the deflationary version of interpretivism faces a fatal dilemma: either it becomes indistinguishable from generic deflationism about the mental, or it must embrace the metaphysical thesis of the judgement-dependence of propositional attitudes. The latter option leads to unacceptable epistemological consequences, as it cannot accommodate intuitions about the possibility of error in the attribution of attitudes. Thus, it turns out that even a subtle version of interpretivism is not a viable theory of intentional states.
In this paper we argue that the inferentialist approach to meaning does not, by itself, show that meaning is normative in a prescriptive sense, and that the constitutive rules argument is especially troubling for this position. To show this, we present the proto-inferentialist theory developed by Ajdukiewicz and claim that, despite the differences between his theory and contemporary inferentialism, rules of language in both theories function more like classificatory devices than prescriptions. Inferentialists can respond by claiming that in their theory meaning is essentially social and hence normative, but we argue that semantic normativity then becomes derivative of social normativity.
The Flame of Eternity provides a reexamination and new interpretation of Nietzsche's philosophy and the central role that the concepts of eternity and time, as he understood them, played in it. According to Krzysztof Michalski, Nietzsche's reflections on human life are inextricably linked to time, which in turn cannot be conceived of without eternity. Eternity is a measure of time, but also, Michalski argues, something Nietzsche viewed first and foremost as a physiological concept having to do with the body. The body ages and decays, involving us in a confrontation with our eventual death. It is in relation to this brute fact that we come to understand eternity and the finitude of time. Nietzsche argues that humanity has long regarded the impermanence of our life as an illness in need of curing. It is this "pathology" that Nietzsche called nihilism. Arguing that this insight lies at the core of Nietzsche's philosophy as a whole, Michalski seeks to explain and reinterpret Nietzsche's thought in light of it. Michalski maintains that many of Nietzsche's main ideas, including his views on love, morality (beyond good and evil), the will to power, overcoming, the suprahuman (or the overman, as it is infamously referred to), the Death of God, and the myth of the eternal return, take on new meaning and significance when viewed through the prism of eternity.
Based on a small research project conducted in Kuala Lumpur (KL) in July and August 2017, the paper discusses the places and practices of young heterosexual Malaysian Muslims dating in KL. In Malaysia, the law (the Khalwat law) does not allow two unrelated people of opposite sexes (where at least one of them is Muslim) to be within ‘suspicious proximity’ of one another in public. This law significantly influences behaviours and activities in urban spaces in KL. Beyond the legal framework, however, the faith of urban users seems to significantly influence the way they perceive space and how they behave in the city. The paper questions the analytical usefulness of the notion of public space (as a Western construct) in an attempt to formulate new intellectual coordinates for discussing urban space in the context of an Islamic, post-colonial, tropical, global city. The ultimate aim of this paper is to open a discussion of how religious imagination and narratives could lead to a new typology of urban spaces.
The article explores the concept of infodemics during the COVID-19 pandemic, focusing on the propagation of false or inaccurate information that proliferated worldwide throughout the SARS-CoV-2 health crisis. We provide an overview of disinformation, misinformation and malinformation, discuss the notion of “fake news”, and highlight the threats these phenomena pose to health policies and to national and international security. We discuss mis- and disinformation as a significant challenge to the public health, intelligence, and policymaking communities and highlight the need to design measures enabling the prevention, interdiction, and mitigation of such threats. We then present an overview of selected opportunities for applying technology to study and combat disinformation, outlining several approaches currently being used to understand, describe, and model the phenomena of misinformation and disinformation. We focus specifically on complex networks, machine learning, data- and text-mining methods in misinformation detection, sentiment analysis, and agent-based models of misinformation spreading and of the detection of misinformation sources in the network. We conclude with a set of recommendations supporting the World Health Organization’s initiative on infodemiology. We support the implementation of integrated preventive procedures and the internationalization of infodemic management. We also endorse the application of the cross-disciplinary methodology of Crime Science, supplemented by Big Data analysis and related information technologies, to prevent, disrupt, and detect mis- and disinformation efficiently.
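To give a concrete sense of the agent-based modelling approach mentioned in the abstract, here is a minimal, self-contained sketch of misinformation spreading on a random contact network. This is purely illustrative: the state names, parameters, and dynamics are our own assumptions and are not taken from the article.

```python
import random

def simulate_spread(n_agents=200, avg_degree=6, p_transmit=0.3,
                    p_correct=0.1, n_steps=50, seed=42):
    """Toy agent-based model of misinformation spreading.

    States (assumed for illustration): 'S' = susceptible,
    'M' = actively spreading misinformation, 'R' = corrected.
    Returns the number of active spreaders at each time step.
    """
    rng = random.Random(seed)
    # Build a random (Erdos-Renyi-style) contact network.
    p_edge = avg_degree / (n_agents - 1)
    neighbors = {i: [] for i in range(n_agents)}
    for i in range(n_agents):
        for j in range(i + 1, n_agents):
            if rng.random() < p_edge:
                neighbors[i].append(j)
                neighbors[j].append(i)
    state = ['S'] * n_agents
    state[0] = 'M'  # a single initial misinformation source
    history = []
    for _ in range(n_steps):
        new_state = state[:]  # synchronous update
        for i in range(n_agents):
            if state[i] == 'M':
                # Try to pass misinformation to susceptible contacts.
                for j in neighbors[i]:
                    if state[j] == 'S' and rng.random() < p_transmit:
                        new_state[j] = 'M'
                # A spreader may encounter a correction and stop.
                if rng.random() < p_correct:
                    new_state[i] = 'R'
        state = new_state
        history.append(state.count('M'))
    return history

counts = simulate_spread()
print("peak spreaders:", max(counts), "final spreaders:", counts[-1])
```

Even a sketch this small exhibits the qualitative behaviour such models are used to study: a rapid initial cascade whose peak and duration depend on the transmission and correction rates, which is why agent-based simulation is a natural tool for evaluating counter-misinformation interventions.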
The discussion of the nature of consciousness seems to have stalled, with the “hard problem of consciousness” at its center, well-defined camps of realists and eliminativists at two opposing poles, and little to no room for agreement in between. Recent attempts to move this debate forward by shifting it to a meta-level have relied heavily on the notion of “intuition”, understood in a rather liberal way. Against this backdrop, the goal of this paper is twofold. First, we want to highlight how the ontological and epistemological status of intuitions restricts the arguments in the debate on consciousness that rely on them. Second, we want to demonstrate how the deadlock in those debates could be resolved through a study of a particular, “positive” kind of intuition. We call this approach “The Canberrish Plan for Consciousness” as it adopts elements of the methodological “Canberra Plan”.
In my paper I discuss the argument that the absence of the legal possibility to contract same-sex marriages is discriminatory. I argue that there is no analogy between the legal situation of same-sex couples and African-Americans, women or disabled persons in the nineteenth century. There are important natural differences between same-sex and different-sex couples that are good reasons for the legal disparities between them. The probability of having and raising children is one of them. Therefore, demanding that same-sex couples have rights similar to those that married couples currently have in Poland and justifying that claim by alleged discrimination is neither correct nor fair.
The cross-cultural differences in epistemic intuitions reported by Weinberg, Nichols and Stich (2001; hereafter: WNS) laid the ground for the negative program of experimental philosophy. However, most of WNS’s findings were not corroborated in further studies. The exception here is the study concerning purported differences between Westerners and Indians in knowledge ascriptions concerning the Zebra Case, which was never properly replicated. Our study replicates the above-mentioned experiment on a considerably larger sample of Westerners (n = 211) and Indians (n = 204). The analysis found a significant difference between the ethnic groups in question in the predicted direction: Indians were more likely to attribute knowledge in the Zebra Case than Westerners. In this paper, we offer an explanation of our result that takes into account the fact that replications of WNS’s other experiments did not find any cross-cultural differences. We argue that the Zebra Case is unique among the vignettes tested by WNS since it should not be regarded as a Gettier case but rather as a scenario exhibiting skeptical pressure concerning the reliability of sense-perception. We argue that skepticism towards perception as a means of gaining knowledge is a trope that is deeply rooted in Western epistemology but is very much absent from Classical Indian philosophical inquiry. This line of reasoning is based on a thorough examination of the skeptical scenarios discussed by philosophers of the Indian Nyaya tradition and their adversaries.
A few years ago Krzysztof Czerniawski published the book "Three Versions of Epistemic Theory of Truth: Dummett, Putnam, Wright". It drew my attention because, while many works are concerned with the philosophical problem of truth, there are only a few comparative studies of different theories of truth. The author focuses on the so-called Epistemic Theory of Truth, which assumes, following the characterization given by Wolfgang Künne, that being true depends to some extent on our judgement. There were, of course, far more philosophers who understood truth in a similar way, e.g. Peirce, Brentano and Neurath; however, Czerniawski concentrates on the most recent history of the Epistemic Theory of Truth. He also leaves aside the philosophy of Habermas and Gadamer, whose ideas on truth can likewise be classified as "epistemic", since they were developed outside the analytic tradition of philosophizing. Thus, he chooses Michael Dummett, Hilary Putnam and Crispin Wright, three analytic philosophers who significantly contributed to the development of "epistemic" approaches to the problem of truth.
The aim of this paper is to argue against the claim that the term “belief”, as it functions in philosophical psychology, has natural-kind term semantics; this thesis is central to the famous Lycan–Stich argument against eliminative materialism. I will argue that the current debate concerning the discrepancy between professed opinions and actions, especially the debate concerning the idea of aliefs, shows that the concept of belief is plastic and amenable to conceptual engineering. This plasticity and amenability to conceptual engineering give us, in turn, a reason to doubt that “belief” functions in the way presupposed by the Lycan–Stich argument. Finally, I point to an alternative to both eliminativism and the natural kind view, namely the idea that we should treat belief as a human kind.
The primary goal of this monograph is to justify the possibility of building a hybrid theory of normative ethics that combines ethical consequentialism, deontology and virtue ethics. The aim of the book is to demonstrate the possibility of constructing a synthetic theory out of ethical traditions that are generally considered to be contradictory. In addition, I propose an outline of an original theory that attempts such a synthesis; I call it Institutional Function Consequentialism. The justification for a hybrid theory of normative ethics requires the resolution of certain meta-ethical issues. I discuss them in part 1.1. After a brief overview of my research aims, I consider the following questions: “What kind of meta-ethical beliefs must we assume for normative ethics?” and "Why should we build ethical theories?". I also discuss the various forms of ethical scepticism that must be rejected if we are to develop a hybrid theory of normative ethics, and I describe possible methods of justifying an ethical theory. At the end of part 1.1 I explain that the best method for justifying an ethical theory is the method of wide reflective equilibrium. In parts 1.2, 1.3 and 1.4 I analyse the three main types of normative ethics that are considered to be competitors: consequentialism, deontology, and virtue ethics. Part 1.2 concerns consequentialism and the difference between utilitarianism and teleological theories; I also present the main reasons for the popularity of consequentialism. In part 1.3 I define deontology, present its variations and show the important benefits that motivate its continuous growth. In part 1.4 I discuss virtue ethics, which is often referred to as a viable alternative to consequentialism and deontology. If we want to build any new ethical theory, we should first describe the problems of the old ones.
For this reason, in chapter 2 I describe the most important arguments against ethical consequentialism, and in chapter 3 I explain the difficulties of deontology. Critical remarks on virtue ethics are considered in part 1.4. The last two chapters directly concern the main aim of the monograph. Because chapter 3 shows that deontology suffers from a number of problems, such as the paradox of deontology, in chapters 4 and 5 I present only consequentialist versions of hybrid theories. These versions try to avoid the standard objections to utilitarianism while combining the advantages of its rivals. I answer the question (4.1) "What is hybrid consequentialism?" and briefly describe (4.2) the most important examples of hybrid theories proposed by others: Samuel Scheffler's idea of prerogatives without restrictions, Brad Hooker's rule consequentialism, the Kantian consequentialism of Richard M. Hare, David Cummiskey and Derek Parfit, and the consequentializing procedure proposed by Douglas W. Portmore. In the last chapter I present a new hybrid version of consequentialism that I call Institutional Function Consequentialism. I start from the thesis (5.1) that the traditional meta-ethics of analytic philosophy incorrectly defines the main field of its research. Rather than focusing on the philosophy of language and metaphysics, it should try to answer one question: “What should morality and ethics be for?”. I therefore introduce an original methodology for meta-ethics that I call the Functional Model of Analysis (FMA). It is a meta-ethical framework in which the fundamental questions concern the practical functions of normative domains such as morality and ethics and the most rational ways of achieving them.
FMA is a meta-ethical project that (a) explains the persistence of fundamental ethical disagreement among experts, (b) sets the background for explaining or justifying the correct structure of an ethical theory, and (c) opens the possibility of creating a hybrid theory of normative ethics. In part 5.2 I formulate a generalized argument against various popular theories of normative ethics, which states that most of them are too narrow and one-sided, causing a permanent dispute between them. There are many distinct practical functions of morality, usually grounded in different types of normative theory, such as Aristotelian ethics, utilitarianism, contractarianism and contractualism. In order to remove these conflicts, the correct ethical theory should try to take into account all of the relevant features of moral life. Unfortunately, most ethical theories are mono-functional. This means that their supporters consciously or unconsciously (1) recognize or accept only one genuine practical function, (2) think that this one practical function overrides all other practical functions, or (3) try to reduce all practical functions to the chosen one. In part 5.2 I present a description and justification of a new normative account that I call Hybrid Function Consequentialism. The formal scheme of this consequentialism can be summarized as follows: we should act according to some important focal points P1…Pn with contents C1…Cn that are selected on the basis of considerations about which kind of P with C will bring about the best realization of the best mix of normative functions F1…Fn. In the next part (5.2.2) I present a new way of “consequentializing” particular theories of normative ethics by applying the scheme of Hybrid Function Consequentialism.
As an example of the use of this procedure, I formulate perfectionist virtue consequentialism, contractarian institution consequentialism, contractualist rule consequentialism and a consequentialism of salvation. In part 5.3 I formulate an original theoretical concept that I call Institutional Function Consequentialism (IFC). It is a hybrid theory based on the previously described Functional Model of Analysis and on a reflection on the role of institutions in modern developed societies. IFC claims that we should always act according to those rules, virtues, motives and intentions that constitute the homeostasis of normative institutions whose internalization by the overwhelming majority of each new generation has maximum expected value in terms of the best realization of the equilibrium of the most important practical functions of normative domains. Responsibility and sanctions related to wrongful actions should depend on the particular institution whose standards have been violated. IFC has several unique features, which are described and justified in part 5.3. It is a form of hybrid consequentialism, mixing features of consequentialism, deontology and virtue ethics on three different levels: justification, structure and content. The main focal point of the theory is not rules or virtues but normative institutions. The values to be optimized in the consequentialist framework are not "happiness", "well-being" or "preferences" but the fulfilment of normative functions. IFC is not limited to the field of ethics alone; it is a meta-normative account of justification for an entire social order comprising a number of different institutions. It also assumes a specific, circular relationship between practical ethics (e.g. research on such important issues as civil disobedience, the fair distribution of wealth, the philosophy of punishment or the autonomy of conscience) and the content of the general rules that may constitute the optimal homeostasis of institutions. It rejects the simplistic belief that moral thinking is based solely on deductive implications from general ethical principles or meta-ethical beliefs.
The aim of this paper is to show that expressivism about attitudes is not a tenable position. Although this claim has often been made in the literature, traditional arguments against attitudinal expressivism assumed a dated form of expressivism. The paper therefore discusses two more recent versions of expressivism, quasi-realism and inferentialist expressivism, in order to show that ascriptions of attitudes cannot be seen as expressive in either of them. Quasi-realism escapes traditional arguments against attitudinal expressivism because it allows expressive statements to be true in a deflationary sense. Nonetheless, it is implausible to adopt quasi-realist expressivism with regard to attitudes because this position cannot avoid substantial psychological commitments, despite Toppinen's recent arguments to the contrary. This conclusion also pertains to quasi-realist hybrid expressivist conceptions. The recently developed conception of inferentialist expressivism arguably provides the most accommodating theory for attitudinal expressivism, as it does not make any substantial psychological commitments. Still, the criteria of bifurcation assumed in this framework do not support the claim that attitudes play an expressive role.
Emerging from the disruption of the First World War, surrealism confronted the resulting ‘crisis of consciousness’ in a way that was arguably more profound than any other cultural movement of the time. The past few decades have seen an expansion of interest in surrealist writers, whose contribution to the history of ideas in the twentieth century is only now being recognised. Surrealism: Key Concepts is the first book in English to present an overview of surrealism through the central ideas motivating the movement. An international team of contributors provide an accessible examination of the key concepts, emphasising their relevance to current debates in social and cultural theory. This book will be an invaluable guide for students studying a range of disciplines, including Philosophy, Anthropology, Sociology and Cultural Studies, and anyone who wishes to engage critically with surrealism for the first time. Contributors: Dawn Ades, Joyce Cheng, Jonathan P. Eburne, Krzysztof Fijalkowski, Guy Girard, Raihan Kadri, Michael Löwy, Jean-Michel Rabaté, Michael Richardson, Donna Roberts, Bertrand Schmitt, Georges Sebbag, Raymond Spiteri, and Michael Stone-Richards.
The papers in this special issue make important contributions to a longstanding debate about how we should conceive of and explain mental phenomena. In other words, they make a case about the best philosophical paradigm for cognitive science. The two main competing approaches, hotly debated for several decades, are representationalism and enactivism. However, recent developments in disciplines such as machine learning and computational neuroscience have fostered a proliferation of intermediate approaches, leading to the emergence of completely new positions, in particular the Predictive Processing approach. Here, we will consider the different approaches discussed in this volume.
Author: Moraczewski Krzysztof Title: REFLECTIONS ON THEORETICAL GROUND OF THE HISTORY OF THE CULTURE (Refleksje o teoretycznych podstawach historii kultury) Source: Filo-Sofija year: 2011, vol. 12, number: 2011/1, pages: 239-262 Keywords: HISTORY OF THE CULTURE, HISTORY OF MENTALITY, MENTAL REALITY, FERNAND BRAUDEL, JERZY KMITA Discipline: PHILOSOPHY Language: POLISH Document type: ARTICLE Over the last 60 years there has been much discussion of the benefits history may gain by applying certain results and theoretical approaches of cultural anthropology and the philosophy of culture. The polemicists involved were significantly less concerned with the opposite direction: the consequences of new models of history for the theory of culture. Trying to follow this direction, I start with the classic propositions of Fernand Braudel and his ‘unfaithful’ followers in France, usually connected with the idea of the ‘history of mentality’. My main concern is – minding all the theoretical and methodological problems that the term ‘mentality’ has caused and continues to cause – to confront their ideas with the concept of culture as a ‘mental reality’, represented by Jerzy Kmita’s social-regulatory conception. The conclusions are not only to substitute the term ‘mentality’ with the theoretical concept of culture (an idea already formulated by Robert Darnton) but above all to broaden and reformulate some of Kmita’s ideas, allowing them to cover the field of research typical of the traditional history of mentality.
The present paper deals with the problem of a public philosophy of digital culture as a possible response by those philosophers who see the need to face the challenges of the Internet and of the visual culture that constitutes an important part of the Internet’s cultural space. It claims that this type of philosophy would have to, among many other things, modify and broaden philosophers’ traditional mode of communication. It would have to expand its textual, or mainly text-related, mode of communication into an aesthetic and visual one. More precisely, philosophers would have to learn how to aestheticize and visualize their ethical narratives by using digital tools – YouTube clips, for example.
The article addresses the problem of choosing a proper hedging strategy in an expected-utility model when forward contracts and option strategies are available. We consider the case of hedging in which an investor formulates his own expectation of the future price of the underlying asset. In this paper we propose a way to measure the effectiveness of a hedging strategy, based on the optimal forward hedge ratio. All results are derived assuming a constant absolute risk aversion (CARA) utility function and a Black-Scholes framework.
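To illustrate the kind of result at stake, here is a standard mean–variance sketch of the optimal forward hedge under CARA utility; the notation and the one-period normal setup are ours, not taken from the article, which works in a Black-Scholes framework:

```latex
% Terminal wealth of an investor with exposure Q to the spot price S_T,
% hedging h units forward at the forward price F_0 (illustrative setup):
W = Q\,S_T + h\,(F_0 - S_T).
% Under CARA utility u(W) = -e^{-\lambda W} with S_T \sim N(\mu, \sigma^2),
% maximizing E[u(W)] is equivalent to maximizing the mean-variance objective
E[W] - \tfrac{\lambda}{2}\,\mathrm{Var}(W)
  = Q\mu + h\,(F_0 - \mu) - \tfrac{\lambda}{2}\,(Q - h)^2\sigma^2.
% The first-order condition in h,
(F_0 - \mu) + \lambda\,(Q - h)\,\sigma^2 = 0,
% yields the optimal forward position and hedge ratio:
h^\ast = Q - \frac{\mu - F_0}{\lambda\sigma^2},
\qquad
\frac{h^\ast}{Q} = 1 - \frac{\mu - F_0}{\lambda Q \sigma^2}.
```

The ratio equals one (a full hedge) exactly when the investor's expected price \(\mu\) coincides with the forward price \(F_0\); any divergence between the two adds a speculative tilt whose size shrinks as the risk-aversion coefficient \(\lambda\) grows.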
The "space" of Lascar strong types, on some sort and relative to a given complete theory T, is in general not a compact Hausdorff topological space. We have at least three aims in this paper. The first is to show that spaces of Lascar strong types, as well as other related spaces and objects such as the Lascar group Gal_L(T), have well-defined Borel cardinalities. The second is to compute the Borel cardinalities of the known examples as well as of some new examples that we give. The third is to explore notions of definable map, embedding, and isomorphism between these and related quotient objects. We also make some conjectures, the main one being roughly "smooth if and only if trivial". The possibility of a descriptive set-theoretic account of the complexity of spaces of Lascar strong types was touched on in the paper [E. Casanovas, D. Lascar, A. Pillay and M. Ziegler, Galois groups of first order theories, J. Math. Logic 1, 305–319], where the first example of a "non-G-compact theory" was given. The motivation for writing this paper is partly the discovery of new examples via definable groups, in [A. Conversano and A. Pillay, Connected components of definable groups and o-minimality I, Adv. Math. 231, 605–623; Connected components of definable groups and o-minimality II, to appear in Ann. Pure Appl. Logic] and the generalizations in [J. Gismatullin and K. Krupiński, On model-theoretic connected components in some group extensions, preprint, arXiv:1201.5221v1].
In this paper, I present the main theses of Aquinas's Way to God: The Proof in De Ente et Essentia by Gaven Kerr. The book in question is a contemporary interpretation and defence of Thomas Aquinas's argument for the existence of God, based on the real distinction between the essence of a thing and its act of being. I stress the fact that Kerr underlines the metaphysical character of Thomas's argument and the role of participation in Aquinas's understanding of the act of being. In the last part of the article, I discuss Kerr's interpretation of Aquinas's argument for the real distinction between essence and the act of being, as well as Kerr's own argument. These arguments are of particular importance since they provide the metaphysical presuppositions for the argument for God's existence considered in Kerr's book. As for the first argument, I argue that the first part of Aquinas's argumentation pertains to the real order rather than the conceptual one. Concerning the second argument, I attempt to highlight the difficulties in Kerr's understanding of the Thomist esse as a principle of the existence of a thing.
Ethical anti-realism is currently represented by roughly 30% of analytic philosophers. They share the conviction that moral properties, facts, or values do not exist. For a long time it was developed chiefly by non-cognitivists. Since the publication of J. Mackie's book Ethics: Inventing Right and Wrong (1977), however, anti-realist scepticism has been radicalized, also taking the form of a global error theory. Accepting the above conviction leads to three strategies: 1. assertoric fictionalism (J. Mackie), 2. non-assertoric fictionalism (R. Joyce), and 3. eliminativism (I. Hinckfuss and R. Garner). In this article, adopting the anti-realist starting point of the proponents of global error theory, I analyse the reasons why it would be worth eliminating ethics and the reasons why we might have a problem with doing so. I then present arguments in light of which the idea of eliminating ethics indirectly supports a shift in thinking about ethical realism from an ontological to a rationalist paradigm. Accordingly, I present the conception of revisionary reasons realism and argue that it withstands the arguments of the proponents of eliminating ethics.
We develop a basic theory of rosy groups and we study groups of small Uþ-rank satisfying NIP and having finitely satisfiable generics: Uþ-rank 1 implies that the group is abelian-by-finite, Uþ-rank 2 implies that the group is solvable-by-finite, and Uþ-rank 2 together with not being nilpotent-by-finite implies the existence of an interpretable algebraically closed field.
Proceedings of a conference held June 26-30, 2007 at Opole University, Poland. -/- This volume explores the three normative sciences that Peirce distinguished (aesthetics, ethics, and logic) and their relation to phenomenology and metaphysics. The essays approach this topic from a variety of angles, ranging from questions concerning the normativity of logic to an application of Peirce’s semiotics to John Coltrane’s “A Love Supreme.”