We study the model theory of random variables using finitary integral logic. We prove the definability of some probability concepts, such as having F as a distribution function, independence, and the martingale property. We then deduce Kolmogorov's existence theorem from the compactness theorem.
Prestack seismic attributes are efficient tools for hydrocarbon exploration and pore fluid detection with the help of various techniques, such as amplitude variation with offset analysis. Such studies focus mainly on siliciclastics rather than carbonates because the detection of fluid effects in carbonate rocks can be masked by their complex pore structure and heterogeneity. Current fluid detection methods from seismic attributes usually rely on a linear background model for the P- and S-wave velocities of the water-saturated rocks, and any deviation from this trend is assigned to possible pore fluid changes. This means that false fluid effects can be detected in carbonate rocks if inappropriate fluid detection attributes are used. This is mainly due to the varying pore structure in carbonates, which can make their background model strongly nonlinear. I observed that the carbonate background model becomes more linear when the crossplot of P-velocity squared versus the product of the P- and S-velocities is used instead of the P-velocity versus S-velocity crossplot. Furthermore, I used this proposed crossplot to define a more appropriate background model for my carbonate sequence containing some percentage of gas. I derived a new seismic fluid attribute based on the proposed background model and compared the results with various other fluid factors. My results highlight fluid changes more brightly and consistently than existing alternatives for carbonate environments.
Human trafficking for organ removal (HTOR) should not be reduced to a problem of supply and demand of organs for transplantation, a problem of organized crime and criminal justice, or a problem of voiceless, abandoned victims. Rather, HTOR is at once an egregious human rights abuse and a form of human trafficking. As such, it demands a human rights-based approach in analysis and response, placing the victim at the center of initiatives to combat this phenomenon. Such an approach requires us to consider how various measures impact or disregard victims and potential victims of HTOR and gives us tools to better advocate for their interests, rights and freedoms.
Symplectic reduction is a formal process through which degeneracy within the mathematical representations of physical systems displaying gauge symmetry can be controlled via the construction of a reduced phase space. Typically such reduced spaces provide us with a formalism for representing both instantaneous states and evolution uniquely, and for this reason can justifiably be afforded the status of fundamental dynamical arena, the otiose structure having been eliminated from the original phase space. Essential to the application of symplectic reduction is the precept that the first-class constraints are the relevant gauge generators. This prescription becomes highly problematic for reparameterization-invariant theories within which the Hamiltonian itself is a constraint, not least because it would seem to render prima facie distinct stages of a history physically identical and observable functions changeless. Here we will consider this problem of time within non-relativistic mechanical theory, with a view to both more fully understanding the temporal structure of these timeless theories and better appreciating the corresponding issues in relativistic mechanics. For the case of nonrelativistic reparameterization-invariant theory, the application of symplectic reduction will be demonstrated to be both unnecessary, since the degeneracy involved is benign, and inappropriate, since it leads to a trivial theory. With this anti-reductive position established, we will then examine two rival methodologies for consistently representing change and observable functions within the original phase space, before evaluating the relevant philosophical implications. We will conclude with a preview of the case against symplectic reduction being applied to canonical general relativity.
An intelligent machine surpassing human intelligence across a wide set of skills has been proposed as a possible source of existential catastrophe. Among those concerned about existential risk related to artificial intelligence, it is common to assume that AI will not only be very intelligent, but also be a general agent. This article explores the characteristics of machine agency, and what it would mean for a machine to become a general agent. In particular, it does so by articulating some important differences between belief and desire in the context of machine agency. One such difference is that while an agent can by itself acquire new beliefs through learning, desires need to be derived from preexisting desires or acquired with the help of an external influence. Such influence could be a human programmer or natural selection. We argue that to become a general agent, a machine needs productive desires, or desires that can direct behavior across multiple contexts. However, productive desires cannot sui generis be derived from non-productive desires. Thus, even though general agency in AI could in principle be created by human agents, general agency cannot be spontaneously produced by a non-general AI agent through an endogenous process. In conclusion, we argue that a common AI scenario, where general agency suddenly emerges in a non-general agent AI, such as DeepMind's superintelligent board game AI AlphaZero, is not plausible.
We claim that, as it stands, the Deutsch–Wallace–Everett approach to quantum theory is conceptually incoherent. This charge is based upon the approach's reliance upon decoherence arguments that conflict with its own fundamental precepts regarding probabilistic reasoning in two respects. This conceptual conflict obtains even if the decoherence arguments deployed are aimed merely towards the establishment of certain 'emergent' or 'robust' structures within the wave function: to be relevant to physical science, notions such as robustness must be empirically grounded, and, on our analysis, this grounding can only plausibly be done in precisely the probabilistic terms that lead to conceptual conflict. Thus, the incoherence problems presented necessitate either the provision of a new, non-probabilistic empirical grounding for the notions of robustness and emergence in the context of decoherence, or the abandonment of the Deutsch–Wallace–Everett programme for quantum theory.
In recent decades, non-representational approaches to mental phenomena and cognition have been gaining traction in cognitive science and philosophy of mind. In these alternative approaches, mental representations either lose their central status or, in the most radical forms, are banned completely. While there is growing agreement that non-representational accounts may succeed in explaining some cognitive capacities, there is widespread skepticism about the possibility of giving non-representational accounts of cognitive capacities such as memory, imagination or abstract thought. In this paper, I will critically examine the view that there are fundamental limitations to non-representational explanations of cognition. Rather than challenging these arguments on general grounds, I will examine a set of human cognitive capacities that are generally thought to fall outside the scope of non-representational accounts, namely numerical cognition. After criticizing standard representational accounts of numerical cognition for their lack of explanatory power, I will argue that a non-representational approach that is inspired by radical enactivism offers the best hope for developing a genuinely naturalistic explanatory account of these cognitive capacities.
Background: Obtaining informed consent for participation in genomic research in low-income settings presents specific ethical issues requiring attention. These include the challenges that arise when providing information about unfamiliar and technical research methods, the implications of complicated infrastructure and data sharing requirements, and the potential consequences of future research with samples and data. This study investigated researchers' and participants' parents' experiences of a consent process, and their understandings of a genome-wide association study of malaria involving children aged five and under in Mali. It aimed to inform best practices in recruiting participants into genomic research.
Methods: A qualitative rapid ethical assessment was undertaken. Fifty-five semi-structured interviews were conducted with the parents of research participants. An additional nine semi-structured interviews were conducted with senior research scientists, research assistants, and a member of an ethics committee. A focus group with five parents of research participants and direct observations of four consent processes were also conducted. French and translated English transcripts were descriptively and thematically coded using OpenCode software.
Results: Participants' parents in the MalariaGEN study had differing understandings of the causes of malaria, the rationale for collecting blood samples, the purposes of the study, and the kinds of information the study would generate. Genomic aspects of the research, including the gene/environment interaction underlying susceptibility or resistance to severe malaria, proved particularly challenging to explain and understand.
Conclusions: This study identifies a number of areas to be addressed in the design of consent processes for genomic research, some of which require careful ethical analysis. These include determining how much information should be provided about differing aspects of the research and how best to promote understandings of genomic research. We conclude that it is important to build capacity in the design and conduct of effective and appropriate consent processes for genomic research in low- and middle-income settings. Additionally, consideration should be given to the role of review committees and community consultation activities in protecting the interests of participants in genomic research.
Disease prioritarianism is a principle that is often implicitly or explicitly employed in the realm of healthcare prioritization. This principle states that the healthcare system ought to prioritize the treatment of disease before any other problem. This article argues that disease prioritarianism ought to be rejected. Instead, we should adopt 'the problem-oriented heuristic' when making prioritizations in the healthcare system. According to this idea, we ought to focus on specific problems and whether or not it is possible and efficient to address them with medical means. This has radical implications for the extension of the healthcare system. First, getting rid of the binary disease/no-disease dichotomy implicit in disease prioritarianism would improve the ability of the healthcare system to address chronic conditions and disabilities that often defy easy classification. Second, the problem-oriented heuristic could empower medical practitioners to address social problems without the need to pathologize these conditions. Third, the problem-oriented heuristic clearly states that what we choose to treat is a normative consideration. Under this assumption, we can engage in a discussion on de-medicalization without distorting preconceptions. Fourth, this pragmatic and de-compartmentalizing approach should allow us to reconsider the term 'efficiency'.
Econophysics is a new and exciting cross-disciplinary research field that applies models and modelling techniques from statistical physics to economic systems. It is not, however, without its critics: prominent figures in more mainstream economic theory have criticized some elements of the methodology of econophysics. One of the main lines of criticism concerns the nature of the modelling assumptions and idealizations involved, and particular targets are the 'kinetic exchange' approaches used to model the emergence of inequality within the distribution of individual monetary income. This article will consider such models in detail, and assess the warrant of the criticisms drawing upon the philosophical literature on modelling and idealization. Our aim is to provide the first steps towards informed mediation of this important and interesting interdisciplinary debate, and our hope is to offer guidance with regard to both the practice of modelling inequality, and the inequality of modelling practice. 1 Introduction; 1.1 Econophysics and its discontents; 1.2 Against burglar economics; 2 Modelling Inequality; 2.1 Mainstream economic models for income distribution; 2.2 Econophysics models for income distribution; 3 Idealizations in Kinetic Exchange Models; 3.1 Binary interactions; 3.2 Conservation principles; 3.3 Exchange dynamics; 4 Fat Tails and Savings; 5 Evaluation.
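As an illustrative aside, the simplest kinetic exchange models of the kind discussed in this abstract can be sketched in a few lines. The sketch below follows the standard random-split exchange rule (in the style of Drăgulescu and Yakovenko's money-conserving model); the agent count, initial endowment, and number of rounds are illustrative assumptions, not parameters taken from the article:

```python
# Minimal kinetic exchange ("binary interaction") sketch: agents meet in
# random pairs and re-split their combined money, conserving the total.
# A well-known result is that the stationary distribution is approximately
# exponential (Boltzmann-Gibbs), so substantial inequality emerges from
# symmetric random exchange alone.
import random

random.seed(0)
N = 1000                      # number of agents (illustrative choice)
money = [100.0] * N           # everyone starts with equal money

for _ in range(200_000):      # roughly 200 exchanges per agent
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    pool = money[i] + money[j]
    eps = random.random()     # random fraction of the combined money
    money[i], money[j] = eps * pool, (1 - eps) * pool

def gini(xs):
    """Gini coefficient: 0 = perfect equality; ~0.5 for an exponential law."""
    xs = sorted(xs)
    n, total = len(xs), sum(xs)
    weighted = sum(k * x for k, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n
```

Total money is conserved exactly (up to floating point), yet the Gini coefficient climbs from zero towards roughly 0.5, the value characteristic of an exponential distribution, which is the sense in which inequality "emerges" in these models.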
Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed, as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.
On one popular view, the general covariance of gravity implies that change is relational in a strong sense, such that all it is for a physical degree of freedom to change is for it to vary with regard to a second physical degree of freedom. At a quantum level, this view of change as relative variation leads to a fundamentally timeless formalism for quantum gravity. Here, we will show how one may avoid this acute 'problem of time'. Under our view, duration is still regarded as relative, but temporal succession is taken to be absolute. Following our approach, which is presented in more formal terms elsewhere, it is possible to conceive of a genuinely dynamical theory of quantum gravity within which time, in a substantive sense, remains. 1 Introduction; 1.1 The problem of time; 1.2 Our solution; 2 Understanding Symmetry; 2.1 Mechanics and representation; 2.2 Freedom by degrees; 2.3 Voluntary redundancy; 3 Understanding Time; 3.1 Change and order; 3.2 Quantization and succession; 4 Time and Gravitation; 4.1 The two faces of classical gravity; 4.2 Retaining succession in quantum gravity; 5 Discussion; 5.1 Related arguments; 5.2 Concluding remarks.
The canonical formalism of general relativity affords a particularly interesting characterisation of the infamous hole argument. It also provides a natural formalism in which to relate the hole argument to the problem of time in classical and quantum gravity. In this paper we examine the connection between these two much-discussed problems in the foundations of spacetime theory along two interrelated lines. First, from a formal perspective, we consider the extent to which the two problems can and cannot be precisely and distinctly characterised. Second, from a philosophical perspective, we consider the implications of various responses to the problems, with a particular focus upon the viability of a 'deflationary' attitude to the relationalist/substantivalist debate regarding the ontology of spacetime. Conceptual and formal inadequacies within the representative language of canonical gravity will be shown to be at the heart of both the canonical hole argument and the problem of time. Interesting and fruitful work at the interface of physics and philosophy relates to the challenge of resolving such inadequacies.
The purpose of this study is to propose the structural outline and conceptual framework of a Ricœurian translation theory. Following a discussion of the ambiguities around situating Ricœur in translation theory, three major interlinked components of the theory are explored. First, the metaphysics of meaning and translation is established based on Ricœur's hermeneutics of infinitude. Then, the language-processing component is constructed through an incorporation of Ricœur's narrative theory. Finally, the ethics and politics of translation, particularly in globalization, are founded on Ricœur's "age of hermeneutics" theory.
Starting from a generalized Hamilton-Jacobi formalism, we develop a new framework for constructing observables and their evolution in theories invariant under global time reparametrizations. Our proposal relaxes the usual Dirac prescription for the observables of a totally constrained system and allows one to recover the influential partial and complete observables approach in a particular limit. Difficulties such as the non-unitary evolution of the complete observables in terms of certain partial observables are explained as a breakdown of this limit. Identification of our observables relies upon a physical distinction between gauge symmetries that exist at the level of histories and states, and those that exist at the level of histories and not states. This distinction resolves a tension in the literature concerning the physical interpretation of the partial observables and allows for a richer class of observables in the quantum theory. There is the potential for the application of our proposal to the quantization of gravity when understood in terms of the Shape Dynamics formalism.
A fundamental tenet of Paul Feyerabend's pluralistic view of science has it that theory proliferation, that is, the availability of theoretical alternatives, is of crucial importance for the detection of anomalies in established theories. Paul Hoyningen-Huene calls this the Anomaly Importation Thesis, according to which anomalies are imported, as it were, into well-established theories from competing alternatives. This article pursues two major objectives: (a) to work out the systematic details of Feyerabend's ideas on theory proliferation and anomaly import as they are presented in his early publications and his Against Method, and (b) to compare Feyerabend's ideas on theory proliferation and anomaly import with corresponding features in Popper's critical rationalist philosophy of science. As it turns out, neither the Principle of Proliferation nor the Anomaly Importation Thesis is necessarily incompatible with critical rationalism. In spite of Feyerabend's general anti-Popperian attitude, I argue that theoretical pluralism can be seen as an advancement of the critical rationalist philosophy and that critical rationalism provides good arguments for pluralism.
There is a distinct lack of research into the relationship between corporate governance and corporate social responsibility (CSR) in the banking sector. This paper fills the gap in the literature by examining the impact of corporate governance, with particular reference to the role of the board of directors, on the quality of CSR disclosure in US listed banks' annual reports after the US sub-prime mortgage crisis. Using a sample of large US commercial banks for the period 2009–2011 and controlling for audit committee characteristics, board meeting frequency, and banks' profitability, size and risk, we find evidence that board independence and board size, the two board characteristics usually associated with the protection of shareholder interests, are positively related to CSR disclosure. This indicates that, with regard to CSR disclosure, more independent boards of directors and larger boards are the internal corporate governance mechanisms which promote both shareholders' and other stakeholders' interests. Contrary to our expectations, CEO duality also impacts positively on CSR disclosure. From an agency-theoretical viewpoint, this suggests that powerful CEOs may promote transparency about banks' CSR activities for their private benefits. While this could indicate that powerful CEOs are under particular pressure to appease stakeholders' concerns that they might abuse their power by providing a high degree of CSR disclosure, it could also be a sign of managerial risk aversion or managers' private reputational concerns.
The intuition that we can think about non-existent objects seems to be in tension with philosophical concerns about the relationality of intentionality. Tim Crane’s psychologism removes this tension by proposing a psychologistic account of intentionality according to which intentionality is a purely non-relational notion. I argue that his account has counterintuitive consequences regarding our thoughts about existing objects, and as such is insufficiently plausible to convince us to reject the relationality of intentionality.
Embodied and extended cognition is a relatively new paradigm within cognitive science that challenges the basic tenet of classical cognitive science, viz. that cognition consists in building and manipulating internal representations. Some of the pioneers of embodied cognitive science have claimed that this new way of conceptualizing cognition puts pressure on epistemological and ontological realism. In this paper I will argue that such anti-realist conclusions do not follow from the basic assumptions of radical embodied cognitive science. Furthermore, I will show that one can develop a form of realism that reflects, rather than just accommodates, the core principles of non-representationalist embodied cognitive science.
We propose a solution to the problem of time for systems with a single global Hamiltonian constraint. Our solution stems from the observation that, for these theories, conventional gauge theory methods fail to capture the full classical dynamics of the system and must therefore be deemed inappropriate. We propose a new strategy for consistently quantizing systems with a relational notion of time that does capture the full classical dynamics of the system and allows for evolution parametrized by an equitable internal clock. This proposal contains the minimal temporal structure necessary to retain the ordering of events required to describe classical evolution. In the context of shape dynamics (an equivalent formulation of general relativity that is locally scale invariant and free of the local problem of time) our proposal can be shown to constitute a natural methodology for describing dynamical evolution in quantum gravity and to lead to a quantum theory analogous to the Dirac quantization of unimodular gravity.
The analysis of the temporal structure of canonical general relativity, and the connected interpretational questions with regard to the role of time within the theory, both rest upon the need to respect the fundamentally dual role of the Hamiltonian constraints found within the formalism. Any consistent philosophical approach towards the theory must pay dues to the role of these constraints in both generating dynamics, in the context of phase space, and generating unphysical symmetry transformations, in the context of a hypersurface embedded within a solution. A first denial of time, in the terms of a position of reductive temporal relationalism, can be shown to be troubled by failure on the first count, and a second denial, in the terms of Machian temporal relationalism, can be found to be hampered by failure on the second. A third denial of time, consistent with both of the Hamiltonian constraints' roles, is constituted by the implementation of a scheme for constructing observables in terms of correlations, and leads to a radical Parmenidean timelessness. The motivation for and implications of each of these three denials are investigated.
In 1981 Unruh proposed that fluid mechanical experiments could be used to probe key aspects of the quantum phenomenology of black holes. In particular, he claimed that an analogue to Hawking radiation could be created within a fluid mechanical 'dumb hole', with the event horizon replaced by a sonic horizon. Since then an entire sub-field of 'analogue gravity' has been created. In 2016 Steinhauer reported the experimental observation of quantum Hawking radiation and its entanglement in a Bose-Einstein condensate analogue black hole. What can we learn from such analogue experiments? In particular, in what sense can they provide evidence of novel phenomena such as black hole Hawking radiation?
The professions have focused considerable attention on developing codes of conduct. Despite these efforts, there is considerable controversy regarding the propriety of professional codes of ethics. Many provisions of professional codes seem to exacerbate disputes between the profession and the public rather than providing a framework that satisfies the public's desire for moral behavior. After examining three professional codes, we divide the provisions of professional codes into those which urge professionals to avoid moral hazard, those which maintain professional courtesy, and those which serve the public interest. We note that whereas provisions urging the avoidance of moral hazard are uncontroversial, the public is suspicious of provisions protecting professional courtesy. Public interest provisions are controversial when the public and the profession disagree as to what is in the public interest. Based on these observations, we conclude with recommendations regarding the content of professional codes.
A small but growing number of studies have aimed to understand, assess and reduce existential risks, or risks that threaten the continued existence of mankind. However, most attention has been focused on known and tangible risks. This paper proposes a heuristic for reducing the risk of black swan extinction events. These events are, as the name suggests, stochastic and unforeseen when they happen. Decision theory based on a fixed model of possible outcomes cannot properly deal with this kind of event; neither can probabilistic risk analysis. This paper will argue that the approach that is referred to as engineering safety could be applied to reducing the risk from black swan extinction events. It will also propose a conceptual sketch of how such a strategy may be implemented: isolated, self-sufficient, and continuously manned underground refuges. Some characteristics of such refuges are also described, in particular the psychosocial aspects. Furthermore, it is argued that safety barriers, as an implementation of the engineering safety strategy, would be effective and plausible and could reduce the risk of an extinction event in a wide range of possible scenarios. Considering the staggering opportunity cost of an existential catastrophe, such strategies ought to be explored more vigorously.
We introduce 'model migration' as a species of cross-disciplinary knowledge transfer whereby the representational function of a model is radically changed to allow application to a new disciplinary context. Controversies and confusions that often derive from this phenomenon will be illustrated in the context of econophysics and phylogeographic linguistics. Migration can be usefully contrasted with the concept of 'imperialism', which has been influentially discussed in the context of geographical economics. In particular, imperialism, unlike migration, relies upon extension of the original model via an expansion of the domain of phenomena it is taken to adequately describe. The success of imperialism thus requires expansion of the justificatory sanctioning of the original idealising assumptions to a new disciplinary context. Contrastingly, successful migration involves the radical representational re-interpretation of the original model, rather than its extension. Migration thus requires 're-sanctioning' of new 'counterpart idealisations' to allow application to an entirely different class of phenomena. Whereas legitimate scientific imperialism should be based on the pursuit of some form of ontological unification, no such requirement is needed to legitimate the practice of model migration. The distinction between migration and imperialism will thus be shown to have significant normative as well as descriptive value.
The study contributes to building an understanding of the impact of political forces on the information environment of listed firms in a developing economy. Specifically, it investigates the tensions between politico-institutional factors and accounting regulation in the prolonged and incomplete implementation of the International Financial Reporting Standards (IFRS) in Bangladesh from 1998 to 2010. Two phases of interviews were conducted in 2010–2011, and IFRS-related enforcement documents from 1998 to 2010 were evaluated. The study finds that IFRSs are being diffused to developing countries like Bangladesh, but they invariably interact with local institutions, with variable outcomes. Coercive, normative and mimetic isomorphisms are low in Bangladesh. Notably, political forces have been undermining mimetic isomorphism because of the high level of government intervention and the high level of political lobbying. Political institutional pressures stand in the way of mimetic isomorphism and constitute negative forces that add further tension to accounting regulation in Bangladesh. Regarding the low level of normative isomorphism, there is evidence of a 'blame culture', with state institutions and professional accountancy institutions in the country blaming each other for the poor progress in IFRS implementation. Although the study focuses on Bangladesh, its results have implications for international policy makers, as well as the governments and regulators of other developing economies facing similar challenges in implementing IFRS.
In this article we argue for the existence of 'analogue simulation' as a novel form of scientific inference with the potential to be confirmatory. This notion is distinct from the modes of analogical reasoning detailed in the literature, and draws inspiration from fluid dynamical 'dumb hole' analogues to gravitational black holes. For that case, which is considered in detail, we defend the claim that the phenomenon of gravitational Hawking radiation could be confirmed in the case that its counterpart is detected within experiments conducted on diverse realizations of the analogue model. A prospectus is given for further potential cases of analogue simulation in contemporary science. 1 Introduction; 2 Physical Background; 2.1 Hawking radiation in semi-classical gravity; 2.2 Modelling sound in fluids; 2.3 The acoustic analogue model of Hawking radiation; 3 Simulation and Analogy in Physical Theory; 3.1 Analogical reasoning and analogue simulation; 3.2 Confirmation via analogue simulation; 3.3 Recapitulation; 4 The Sound of Silence: Analogical Insights into Gravity; 4.1 Experimental realization of analogue models; 4.2 Universality and the Hawking effect; 4.3 Confirmation of gravitational Hawking radiation; 5 Prospectus.
Brain machine interface (BMI) technology makes direct communication between the brain and a machine possible by means of electrodes. This paper reviews the existing and emerging technologies in this field and offers a systematic inquiry into the relevant ethical problems that are likely to emerge in the following decades.
As we learn more about the human brain, novel biotechnological means to modulate human behaviour and emotional dispositions become possible. These technologies could be used to enhance our morality. Moral bioenhancement, an instance of human enhancement, alters a person's dispositions, emotions or behaviour in order to make that person more moral. I will argue that moral bioenhancement could be carried out in three different ways. The first strategy, well known from science fiction, is behavioural enhancement. The second strategy, favoured by prominent defenders of moral bioenhancement, is emotional enhancement. The third strategy is the enhancement of moral dispositions, such as empathy and inequity aversion. I will argue that we ought to implement a combination of the second and third strategies. Furthermore, I will argue that the usual arguments against other instances of human enhancement do not apply to moral bioenhancement, or apply only to the first strategy, behavioural enhancement.
This article will explore a problem related to our moral obligations towards species. Although the re-creation of extinct animals has been discussed to some degree both in lay deliberations and by scientists, advocates tend to emphasize the technological and scientific value of such an endeavour, and the “coolness” factor. This article will provide an argument in favour of re-creation based on normative considerations. The environmentalist community generally accepts that it is wrong to exterminate species, for reasons beyond any instrumental value these species may have. It is often also claimed that humanity has a collective responsibility to either preserve or at least not exterminate species. These two beliefs are here assumed to be correct. The argument presented here takes these two ideas as its point of departure and places them in a deontological framework, from which it is argued that when humanity causes the extinction of a species, this is a moral transgression, entailing a residual obligation. Such an obligation implies a positive duty to mitigate any harm caused by our moral failure. In light of recent scientific progress in the field of genetic engineering, it will be argued that humanity has a prima facie obligation to re-create species whose extinction mankind may have caused, a process known as de-extinction.
The generalized theory of evolution suggests that evolutionary algorithms apply alike to biological processes and to cultural processes such as language. Variation, selection and reproduction constitute abstract and formal traits of complex, open and often self-regulating systems. Accepting this basic assumption provides us with a powerful background methodology for this investigation: explaining the emergence and proliferation of semantic patterns that become conventional. A teleosemantic theory of public (conventional) meaning (Millikan 1984; 2005) grounded in a generalized theory of evolution explains the proliferation of public language forms in terms of their adaptive proper function. It has also been suggested that the emergence of meaning can be formalized with game-theoretical tools (Skyrms 2010) within signaling systems of coordination. I want to show how closely related these approaches are, both in terms of explanandum and of outcomes. To put it in a nutshell: if the emergence of public meaning can be satisfyingly explained in terms of signaling games, then cultural evolutionary dynamics will serve as an adequate model to describe their proliferation. Public or conventional meaning (in contrast to personal meaning) can be fully understood in terms of its evolutionary function in a population of communicators. Furthermore, I want to argue that this understanding of conventional meaning could lead us to a strong semantic holism.
Since the publication of Clark and Chalmers' Extended Mind paper, the central claims of that paper, viz. that cognitive processes and cognitive or mental states extend beyond the brain and body, have been vigorously debated within philosophy of mind and philosophy of cognitive science. Both defenders and detractors of these claims have since marshalled an impressive battery of arguments for and against “active externalism.” However, despite the amount of philosophical energy expended, this debate remains far from settled. We argue that this debate can be understood as addressing two metaphysical questions. Yet prominent voices within the debate have assumed that there is a tight relationship between these two questions, such that one question can be answered via the other. We defend an alternative ‘wide’ view, whereby mentality is understood as constituted by wide social and cultural factors. Our wide view entails that the two metaphysical questions are separate and should be kept distinct. This suggests that active externalism, as understood by prominent voices within that debate, requires dissolution rather than solution. However, if the debate were instead understood as focusing only on the second of the two questions, then there could be a possible future for this debate.
Retractions solicited by authors following the discovery of an unintentional error—what we henceforth call a “self-retraction”—are a new phenomenon of growing importance, about which very little is known. Here we present results of a small qualitative study aimed at gaining preliminary insights about circumstances, motivations and beliefs that accompanied the experience of a self-retraction. We identified retraction notes that unambiguously reported an honest error and that had been published between the years 2010 and 2015. We limited our sample to retractions with at least one co-author based in the Netherlands, Belgium, United Kingdom, Germany or a Scandinavian country, and we invited these authors to a semi-structured interview. Fourteen authors accepted our invitation. Contrary to our initial assumptions, most of our interviewees had not originally intended to retract their paper. They had contacted the journal to request a correction and the decision to retract had been made by journal editors. All interviewees reported that having to retract their own publication made them concerned for their scientific reputation and career, often causing considerable stress and anxiety. Interviewees also encountered difficulties in communicating with the journal and recalled other procedural issues that had unnecessarily slowed down the process of self-retraction. Intriguingly, however, all interviewees reported how, contrary to their own expectations, the self-retraction had brought no damage to their reputation and in some cases had actually improved it. We also examined the ethical motivations that interviewees ascribed, retrospectively, to their actions and found that such motivations included a combination of moral and prudential considerations. These preliminary results suggest that scientists would welcome innovations to facilitate the process of self-retraction.
In this paper we give a short semantic proof of strong normalization for full propositional classical natural deduction. The proof adapts the reducibility candidates introduced by J.-Y. Girard, as simplified for the classical case by M. Parigot.
International marketing practices, embedded in a strong ethical doctrine, can play a vital role in raising the standards of business conduct worldwide, while in no way compromising the quality of services or products offered to customers, or surrendering the profit margins of businesses. Adherence to such ethical practices can help to elevate the standards of behavior, and thus of living, of traders and consumers alike. Against this background, this paper endeavors to identify the salient features of the Islamic framework of International Marketing Ethics. In particular, it highlights the capabilities and strengths of this framework in creating and sustaining a strong ethical international marketing culture. At the heart of Islamic marketing is the principle of value-maximization based on equity and justice (constituting just dealing and fair play) for the wider welfare of the society. Selected key international marketing issues are examined from an Islamic perspective which, it is argued, if adhered to, can help to create a value-loaded global ethical marketing framework for MNCs in general, and establish harmony and meaningful cooperation between international marketers and Muslim target markets in particular.
We present a novel procedure to engage the public in ethical deliberations on the potential impacts of brain machine interface technology. We call this procedure a convergence seminar, a form of scenario-based group discussion that is founded on the idea of hypothetical retrospection. The theoretical background of this procedure and the results of five seminars are presented.
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which also provides a subjective implementation of the Born rule but derives it from empirical data rather than decision theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubt on its scientific value.