Weak measurement devices resemble band-pass filters: they strengthen average values in the state space or, equivalently, filter out some ‘frequencies’ from the conjugate, Fourier-transformed vector space. We thereby adapt a principle of classical communication theory for use in quantum computation. We discuss some of the computational benefits and limitations of such an approach, including a complexity analysis, some simple examples and a realistic not-so-weak approach.
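As a rough classical illustration of the band-pass analogy (a sketch only, not the paper's construction; the dimension, the chosen pass-band and all variable names are assumptions introduced here):

```python
import numpy as np

# Toy state vector of assumed dimension 8, normalised.
dim = 8
rng = np.random.default_rng(0)
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)

# Move to the conjugate ("frequency") representation.
phi = np.fft.fft(psi, norm="ortho")

# Band-pass: keep an arbitrary band of components, discard the rest.
keep = np.zeros(dim, dtype=bool)
keep[1:4] = True                      # illustrative pass-band
phi_filtered = np.where(keep, phi, 0)

# Back to the original representation, renormalised.
psi_filtered = np.fft.ifft(phi_filtered, norm="ortho")
psi_filtered /= np.linalg.norm(psi_filtered)

print(np.round(np.abs(psi_filtered) ** 2, 3))
```

In the quantum setting the filtering is effected by the weak measurement itself rather than by an explicit Fourier transform; the sketch only illustrates the "keep a band, discard the rest, renormalise" structure of the analogy.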
This paper analyses the methodological ideas of Sergei Vsekhsvyatskii’s Studies on Philosophical Issues of Cosmology and Cosmogony. The article examines the background and history of the development of astronomy and cosmology in Ukraine and its gradual transition from a descriptive method to mathematical analysis. The authors have studied the influence of Ukrainian scholars and philosophers on studies in cosmology, astronomy, philosophical issues in cosmology, and computational cosmology. The philosophical understanding of cosmology and cosmogony is always a search for alternative views on generally accepted ideas, not intending to deny the merits of other sciences but to critically analyse the conclusions that the natural sciences draw from empirical data and mathematical calculations. The contribution to global knowledge by Ukrainian astronomers, astrophysicists, cosmologists, and other scientists who work on questions about the Universe has been studied. The article indicates the modern research areas in which the main contributions were made (dark matter, dark energy, gravitational lensing). These contemporary problems of cosmology go beyond the usual field of understanding of the natural sciences, which opens up the possibility of an interdisciplinary dialogue between physicists and philosophers on the problems of cosmology and cosmogony. Using the example of Sergei Vsekhsvyatskii’s works, the authors show how philosophical analysis can reveal weaknesses in a methodology and pose key questions for understanding the essence of cosmic processes. In the 1960s, he investigated the density of space objects in order to find new approaches to the cosmogonic processes of the past. However, even today, the density of the Universe raises major questions. Even though science has made a huge leap in half a century, the essence of scientific and philosophical search remains the same.
The films of Sergei Parajanov remain some of the most stylistically unique in the history of the medium and easily place him within the pantheon of the world's great filmmakers. This article offers a new perspective on Parajanov's art through a detailed examination of the two works at the center of his oeuvre, The Colour of Pomegranates and The Legend of Suram Fortress. In addition to their undeniable aesthetic value, these films may be appreciated as meaningful discourse on our conceptions of time, perception, and identity. Like Parajanov's other films, they dismantle the perceptual and narrative structure of classical cinema in order to stimulate awareness of an expressly raw layer of reality beneath what we customarily take to be static, indivisible essences or identities. With specific attention to the correlation of difference, repetition, and perception, this article also focuses on the effects this presentation of perpetual flux and variation has on consciousness and subjectivity.
Tracing how the logic of inoperativity works in the domains of language, law, history and humanity, 'Agamben and Politics' systematically introduces the fundamental concepts of Agamben's political thought and critically interprets his insights in the wider context of contemporary philosophy.
We compare the logic HYPE, recently suggested by H. Leitgeb as a basic propositional logic for dealing with hyperintensional contexts, with Heyting-Ockham logic, introduced in the course of studying logical aspects of the well-founded semantics for logic programs with negation. The semantics of Heyting-Ockham logic makes use of the so-called Routley star negation. It is shown how the Routley star negation can be obtained from Dimiter Vakarelov’s theory of negation and that propositional HYPE coincides with the logic characterized by the class of all involutive Routley star information frames. This result provides a much simplified semantics for HYPE and also a simplified axiomatization, which shows that HYPE is identical with the modal symmetric propositional calculus introduced by G. Moisil in 1942. Moreover, it is shown that HYPE can be faithfully embedded into a normal bi-modal logic based on classical logic. Against this background, we discuss the notion of hyperintensionality.
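For orientation (a standard formulation of the clause in question, stated here only as an illustration, not quoted from the paper): each information state w comes with a companion "star" state w*, and negation is evaluated by switching to the star state,

\[
w \Vdash \neg A \iff w^{*} \nVdash A, \qquad \text{involution: } w^{**} = w .
\]

The involution condition w** = w is what is meant by speaking of involutive Routley star information frames.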
Sergei Prozorov challenges the assumption that biopolitical governance means the end of democracy, arguing for a positive synthesis of biopolitics and democracy. He develops a vision of democratic biopolitics in which diverse forms of life can coexist on the basis of their reciprocal recognition as free, equal and in common.
This article examines a still contentious question: how conservative and liberal elements are combined in Boris N. Chicherin’s worldview and political doctrine. It considers several points of view...
'Ontological Semantics' introduces a comprehensive approach to the treatment of text meaning by computer, arguing that being able to use meaning is crucial to the success of natural language processing applications.
A paedophile is a person with a sexual attraction to children; some paedophiles commit child sex abuse offences. For such acts, they hold moral and legal responsibility, which presupposes that paedophiles are moral agents who can distinguish right from wrong and are capable of self-control. Like any other moral agents, paedophiles have moral duties. Some moral duties are universal, e.g., the duty not to steal. Whether there are any specific moral duties related to paedophilia is the topic of this paper. I argue that the moral duty not to commit child sex abuse is universal, and the duty to reduce the individual risk of child sex abuse is specific to paedophiles. I further argue that any society has a moral duty to help paedophiles reduce the risk. Both duties provide grounds for moral judgement. Paedophiles should be judged not for their sexual interest but for their efforts to avoid child sex abuse. If a paedophile has an opportunity to reduce the risk of child sex abuse, he is obliged to do so. Unfortunately, societies rarely provide such opportunities and hence fail in their moral duty to paedophiles and children.
Why does the theory of law have such a significant role in Russian liberalism, and how is this related to the state of the legal system in Russia and to the public’s legal consciousness? This intro...
This paper commemorates the presentation of the honorary doctorate, in May 2001 by the University of Łódź, to Professor Andrzej Walicki. On this occasion, the Honorary Graduate delivered a lecture devoted to his first philosophy teacher – Sergej Iosifovich Hessen, a prominent Russian Neo-Kantian philosopher and a liberal in matters social and political. I try to analyze the main features of Hessen's philosophical neo-Kantianism, in particular the inevitability of a choice between the absolute and the relative, both in epistemology and in ethics, in the context of contemporary philosophy.
Here is an account of recent investigations into the two main concepts of negation developed in constructive logic: negation as reduction to absurdity, and strong negation. These concepts are studied in the setting of paraconsistent logic.
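To fix terminology (an illustration added here, not part of the original text): intuitionistic negation is defined as reduction to absurdity, while Nelson-style strong negation comes with constructive falsity conditions of its own, for example

\[
\neg A := A \rightarrow \bot, \qquad
{\sim}(A \wedge B) \leftrightarrow ({\sim}A \vee {\sim}B), \qquad
{\sim}{\sim}A \leftrightarrow A .
\]

The first is the reduction-to-absurdity negation; the displayed equivalences are characteristic theorems governing strong negation ∼.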
The relationships between various modal logics based on Belnap and Dunn’s paraconsistent four-valued logic FDE are investigated. It is shown that the paraconsistent modal logic that lacks a primitive possibility operator ◇ is definitionally equivalent with the logic that has both □ and ◇ as primitive modalities. Next, a tableau calculus for the paraconsistent modal logic KN4 introduced by L. Goble is defined and used to show that KN4 is definitionally equivalent with the former, ◇-free logic without the absurdity constant. Moreover, a tableau calculus is defined for the modal bilattice logic MBL introduced and investigated by A. Jung, U. Rivieccio, and R. Jansana. MBL is a generalization of the Belnapian modal logic BK that in its Kripke semantics makes use of a four-valued accessibility relation. It is shown that MBL can be faithfully embedded into a suitable bimodal logic over the non-modal vocabulary of MBL. On the way from BK to MBL, a Fischer Servi-style modal logic is defined as the set of all modal formulas valid under a modified standard translation into first-order FDE, and this logic is shown to be characterized by the class of all models for the bimodal logic. Moreover, it is axiomatized, and this axiom system is proved to be strongly sound and complete with respect to the class of models for the bimodal logic. Finally, the notion of definitional equivalence is suitably weakened, so as to obtain a weak definitional equivalence result relating the logics under consideration.
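For orientation (an illustrative formulation, not necessarily the exact one used in the paper): in FDE-based modal logics such as BK, a model supplies two support relations, ⊩⁺ (support of truth) and ⊩⁻ (support of falsity), and the clauses for the box typically read

\[
\begin{aligned}
w \Vdash^{+} \Box A &\iff u \Vdash^{+} A \ \text{for all } u \text{ with } wRu,\\
w \Vdash^{-} \Box A &\iff u \Vdash^{-} A \ \text{for some } u \text{ with } wRu,
\end{aligned}
\]

so that truth and falsity conditions are stated independently, which is what lets the four FDE values propagate through the modality.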
The language of the basic logic of proofs extends the usual propositional language by forming sentences of the sort “x is a proof of F” for any sentence F. In this paper, a complete axiomatization of the basic logic of proofs in Heyting Arithmetic HA is given.
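For orientation (a standard way of stating the intended reading, added here as an illustration): proof assertions are interpreted arithmetically by the proof predicate of the underlying theory, so that for Heyting Arithmetic

\[
(x : F)^{*} \;=\; \mathrm{Proof}_{\mathsf{HA}}\!\left(x, \ulcorner F^{*} \urcorner\right),
\]

where Proof_HA(x, y) is the standard primitive recursive predicate “x is (the code of) an HA-derivation of the formula with code y”.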
I argue that teaching evaluation tools (TETs) may function as ethical codes (ECs), and answer certain demands that ECs cannot sufficiently fulfill. In order to be viable, an EC related to the teaching profession must assume a different form, and such a form is already present in several of the contemporary TETs. The TET matrix form allows for certain features that are incompatible with the EC form. The TET’s benchmark scale – ranging from below the acceptable to the ideal level across the dimensions of teachers’ competence – allows it to answer the demands upon ECs raised by the EC critics. TETs embrace both minimal requirements and lofty ideals, while ECs of different types have a hard time combining these two features. TETs relate to different dimensions of teacher competence, connecting ethical and professional components with didactical, managerial and cognitive components, without lumping them together.
The work aims to demonstrate that at the heart of Eriugena’s approach to Christian theology there lies a profoundly philosophical interest in the necessity of a cardinal shift in the paradigms of thinking – namely, from the metaphysical to the dialectical one – which won him the reputation of the ‘Hegel of the ninth century,’ as scholars in post-Hegelian Germany called him. The prime concern of Eriugena’s discourse is to prove that the actual adoption of the salvific truth of Christ’s revelation about all humans’ Sonship to God directly depends on the way the truth of God’s Oneness is consistently thought of. It is exactly the dialectic of the universal and the particular which allows Eriugena both to tackle the dichotomy between being and non-being and to proceed towards raising the question of how the totality of God’s being can be approached, so as to let him radically reconsider the predominantly metaphysical view of creation on which theological reflection traditionally relies. According to the dialectical understanding of unity to which Eriugena adheres, the reality of creation cannot be thought of, and therefore known, otherwise than as inseparable from the universal Principle of all. This is the Principle abandoned by nothing, unless the mind corrupted by the senses thinks otherwise and, following the metaphysical pattern of dichotomy, improperly sets the creation and its Principle apart. Restoration of the mind to the proper rational motion of recta ratio, freed, as Eriugena argues, from the dictates of the senses, therefore becomes the way of both the epistemological breakthrough to the infinite whole and the practical return from the world of finite things to living in the divine reality of creation. The work’s argument is based on the assumption of a close affinity between Eriugena’s discourse and that of his Islamic contemporaries, who developed their dialectical ideas within the Mu’tazilah tradition of a philosophically disciplined approach to the truth of God’s Oneness. In particular, al-Nazzam’s engagement with Parmenides’ Periphyseon and his resistance to the danger of a dualistic interpretation of its ontology seem to provoke Eriugena’s innovative approach to Christian theology with a view to suggesting a mode of overcoming dualism as the main obstacle on the way to the Truth revealed. This vision of the meaning of Eriugena’s undertaking allows us not only to better understand the novelty of his approach to Christian theology, but also to reconsider some of the key points of his discourse that seem to have become a sort of commonplace in Eriugenian studies: 1. Unlike the prevalent opinion, not the forms of the division of Nature but the modes of interpreting being and non-being are to be understood as constituting the genuine subject-matter of each book of the Periphyseon and, hence, of the five parts of his system. 2. The fourfold division of Nature is to be interpreted not as a basic structure of the system offered by Eriugena, but as a means of introducing dialectic into the body of theology by refuting Augustine’s metaphysical vision of a hierarchical model of the universe and indicating the way of resolution of the cardinally theological contradiction – God does and does not create at the same time. 3.
All this gives reason to disagree with the general tendency of associating Eriugena’s work with an exploration of the division of God’s Nature, and to reinterpret it as an immense anti-division project, to be understood as an important turn in the history of Christian thought entirely focused on the truth of God’s Oneness and human life in conformity with it.
In 1933 Gödel introduced a calculus of provability (also known as the modal logic S4) and left open the question of its exact intended semantics. In this paper we give a solution to this problem. We find the logic LP of propositions and proofs and show that Gödel's provability calculus is nothing but the forgetful projection of LP. This also achieves Gödel's objective of defining intuitionistic propositional logic Int via classical proofs and provides a Brouwer-Heyting-Kolmogorov style provability semantics for Int, which had resisted formalization since the early 1930s. LP may be regarded as a unified underlying structure for intuitionistic and modal logics, typed combinatory logic and λ-calculus.
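For orientation (the principles of LP as standardly presented in the literature, together with the forgetful projection, added here as an illustration rather than quoted from the paper):

\[
\begin{aligned}
&\text{reflexivity:} && t\!:\!F \rightarrow F\\
&\text{application:} && s\!:\!(F \rightarrow G) \rightarrow \bigl(t\!:\!F \rightarrow (s\cdot t)\!:\!G\bigr)\\
&\text{proof checker:} && t\!:\!F \rightarrow\; !t\!:\!(t\!:\!F)\\
&\text{sum:} && s\!:\!F \rightarrow (s+t)\!:\!F, \qquad t\!:\!F \rightarrow (s+t)\!:\!F\\
&\text{forgetful projection:} && (t\!:\!F)^{\circ} \;=\; \Box\, F^{\circ}
\end{aligned}
\]

Erasing the individual proof terms in this way turns each LP theorem into an S4 theorem, which is the sense in which S4 is the forgetful projection of LP.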
This paper sheds light on the relationship between the logic of generalized truth values and the logic of bilattices. It suggests a definite solution to the problem of axiomatizing the truth and falsity consequence relations, $\vDash_t$ and $\vDash_f$, considered in a language without implication and determined via the truth and falsity orderings on the trilattice SIXTEEN₃. The solution is based on the fact that a certain algebra isomorphic to SIXTEEN₃ generates the variety of commutative and distributive bilattices with conflation.
We describe a general logical framework, Justification Logic, for reasoning about epistemic justification. Justification Logic is based on classical propositional logic augmented by justification assertions t:F, read as “t is a justification for F”. Justification Logic absorbs basic principles originating from both mainstream epistemology and the mathematical theory of proofs. It contributes to the study of the well-known Justified True Belief vs. Knowledge problem. We state a general Correspondence Theorem showing that behind each epistemic modal logic there is a robust system of justifications. This renders a new, evidence-based foundation for epistemic logic. As a case study, we offer a resolution of the Goldman ‘Red Barns’ example in Justification Logic. Furthermore, we formalize the well-known Gettier example and reveal hidden assumptions and redundancies in Gettier’s reasoning.
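The flavour of such a correspondence can be illustrated (for the S4/LP pair, as one example; the abstract's Correspondence Theorem is stated for epistemic modal logics in general) by the realization theorem:

\[
\vdash_{\mathsf{S4}} F \quad\Longleftrightarrow\quad \vdash_{\mathsf{LP}} F^{r} \ \text{for some realization } r,
\]

where a realization r replaces every occurrence of □ in F by a justification term, turning each □G into an explicit assertion t:G.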
Early normative studies of human behavior revealed a gap between the norms of practical rationality (what humans ought to do) and actual human behavior (what they in fact do). It has been suggested that, to close the gap between the descriptive and the normative, one has to revise the norms of practical rationality according to the Quinean, engineering view of normativity. On this view, the norms must be designed such that they effectively account for behavior. I review recent studies of human perception which pursued normative modeling and which found good agreement between the normative prescriptions and the actual behavior. I make the case that the goals and methods of this work have been incompatible with those of the engineering approach. I argue that norms of perception and action are observer-independent properties of biological agents; the norms are discovered using the methods of the natural sciences rather than designed to fit the observed behavior.
This essay is an explication and analysis of the work of Sergei Kotliarevskii, a major Russian liberal theorist, focusing on his 1915 treatise Vlast’ i pravo. Problema pravovogo gosudarstva (Power and Law: The Problem of the Lawful State). Although the “lawful state” has long been a subject of interest and controversy (even at the definitional level) among historians and political scientists, curiously Kotliarevskii has not received the attention he deserves. His study of the concept of the lawful state, which for him was integrally related to the ideal of the rule of law, is an important Russian contribution to the history and philosophy of law and the state. This essay explores the philosophical sources and contexts of his work; his understanding of the relationship among power, law, and the state; his thesis that religious ideas and institutions were most important in the historical development of legal consciousness; his consideration of the modern constitutional state; and his conviction that personhood—the absolute value and dignity of the human person—was the ultimate justification for the rule of law.
Two traditions have had a great impact on the theoretical and experimental research of perception. One tradition is statistical, stretching from Fechner's enunciation of psychophysics in 1860 to the modern view of perception as statistical decision making. The other tradition is phenomenological, from Brentano's “empirical standpoint” of 1874 to the Gestalt movement and the modern work on perceptual organization. Each tradition has at its core a distinctive assumption about the indivisible constituents of perception: the just-noticeable differences of sensation in the tradition of Fechner vs. the phenomenological Gestalts in the tradition of Brentano. But some key results from the two traditions can be explained and connected using an approach that is neither statistical nor phenomenological. This approach rests on a basic property of any information exchange: a principle of measurement formulated in 1946 by Gabor as a part of his quantal theory of information. Here the indivisible components are units (quanta) of information that remain invariant under changes of precision of measurement. This approach helped to understand how sensory measurements are implemented by single neural cells. But recent analyses suggest that this approach has the power to explain larger-scale characteristics of sensory systems.
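For reference (added here only as a pointer, stated in the usual normalization in terms of standard deviations rather than Gabor's own "effective" measures), the principle of measurement in question is the time–frequency uncertainty relation

\[
\sigma_{t}\,\sigma_{f} \;\ge\; \frac{1}{4\pi},
\]

so no signal can be made arbitrarily sharp in time and in frequency at once; the minimal cells this bound carves out of the time–frequency plane are Gabor's indivisible units ("logons", or quanta) of information.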
N4-lattices provide algebraic semantics for the logic N4, the paraconsistent variant of Nelson's logic with strong negation. We obtain a representation of N4-lattices showing that the structure of an arbitrary N4-lattice is completely determined by a suitable implicative lattice with a distinguished filter and ideal. We also introduce special filters on N4-lattices and prove that special filters are exactly the kernels of homomorphisms. Criteria for embeddability and for being a homomorphic image are obtained for N4-lattices in terms of the above-mentioned representation. Finally, subdirectly irreducible N4-lattices are described.
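For orientation (an illustration of the general twist-structure idea, not necessarily the exact representation established in the paper): such algebras are commonly built from pairs over an underlying implicative lattice, with operations like

\[
\begin{aligned}
(a_1,b_1)\wedge(a_2,b_2) &= (a_1\wedge a_2,\; b_1\vee b_2),\\
(a_1,b_1)\vee(a_2,b_2) &= (a_1\vee a_2,\; b_1\wedge b_2),\\
(a_1,b_1)\rightarrow(a_2,b_2) &= (a_1\rightarrow a_2,\; a_1\wedge b_2),\\
{\sim}(a,b) &= (b,a).
\end{aligned}
\]

Roughly, the first component tracks support of truth and the second support of falsity; a distinguished filter and ideal then control which pairs are admitted into the algebra.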
This work treats the problem of axiomatizing the truth and falsity consequence relations, $ \vDash _t $ and $ \vDash _f $, determined via truth and falsity orderings on the trilattice SIXTEEN₃. The approach is based on a representation of SIXTEEN₃ as a twist-structure over the two-element Boolean algebra.
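As a rough illustration of how order-based consequence is typically defined on the trilattice (the exact formulation in the paper may differ in detail):

\[
A \vDash_t B \iff v(A) \le_t v(B) \ \text{for every valuation } v \text{ into SIXTEEN}_3,
\]

with $\vDash_f$ defined analogously in terms of the falsity ordering $\le_f$.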
The main varieties of scientific misconduct are fabrication, falsification, misquoting and plagiarism. Considering the "improvement" of fraudulent skills, scientists, editors, and authorities must jointly combat misconduct. It is also important that whistleblowers be protected from revenge. The response to scientific misconduct requires national and international bodies to provide leadership and guidelines. Whistleblowers need a safe, confidential place to report misconduct. The quality of research and hidden conflicts of interest should be taken into account when deciding which studies are to be included in reviews. Forged publications and speculative theories have been used for the promotion of drugs, dietary supplements and treatments without proven effectiveness. Marketing of placebos in the guise of evidence-based medications seems to be on the increase. Patients can be misinformed not only by advertising but also by publications supposed to be scientific. Furthermore, it has become usual practice to disregard published criticism in spite of personal communications and debates at conferences. Some scientists seem to make use of critical comments without citing them, or simply continue publishing while ignoring the criticism. The same scientists continue working, sometimes in cooperation with renowned researchers, and it is possible that some later articles are more reliable than earlier ones. However, it is insufficient to hope that reliable publications will soon be confirmed while forgeries fall into oblivion. Fake papers are misleading for research and practice, and cost time and money. Wrong concepts persist and reappear, which may result in useless experimentation and the application of invasive methods without sufficient indications. International cooperation of bona fide scientists, editors and authorities is needed to eradicate scientific misconduct and fraud in medicine. The book contains an overview of misconduct in medical research and practice, mainly from the former Soviet Union. Ample documentary evidence is provided as illustrations.