Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual logic of partitions. That dual logic is described here. Partition logic is at the same mathematical level as subset logic since models for both are constructed from (partitions on or subsets of) arbitrary unstructured sets with no ordering relations, compatibility or accessibility relations, or topologies on the sets. Just as Boole developed logical finite probability theory as a quantitative treatment of subset logic, applying the analogous mathematical steps to partition logic yields a logical notion of entropy so that information theory can be refounded on partition logic. But the biggest application is that when partition logic and the accompanying logical information theory are "lifted" to complex vector spaces, then the mathematical framework of quantum mechanics is obtained. Partition logic models indefiniteness (i.e., numerical attributes on a set become more definite as the inverse-image partition becomes more refined) while subset logic models the definiteness of classical physics (an entity either definitely has a property or definitely does not). Hence partition logic provides the backstory so the old idea of "objective indefiniteness" in QM can be fleshed out to a full interpretation of quantum mechanics.
Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., u and u′ being in different blocks of π. Subset logic leads to finite probability theory by taking the (Laplacian) probability as the normalized size of each subset-event of a finite universe. The analogous step in the logic of partitions is to assign to a partition the number of distinctions made by the partition normalized by the total number of ordered pairs |U|² from the finite universe. That yields a notion of "logical entropy" for partitions and a "logical information theory." The logical theory directly counts the (normalized) number of distinctions in a partition while Shannon's theory gives the average number of binary partitions needed to make those same distinctions. Thus the logical theory is seen as providing a conceptual underpinning for Shannon's theory based on the logical notion of "distinctions."
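The counting step described above can be sketched in a few lines of Python. This is an illustrative example with a hypothetical four-element universe and partition, not taken from the paper:

```python
from itertools import product

U = {0, 1, 2, 3}
# A partition of U given as a list of disjoint, nonempty blocks.
partition = [{0, 1}, {2}, {3}]

def block_of(x, blocks):
    """Return the block of the partition containing x."""
    return next(B for B in blocks if x in B)

# dit(pi): the ordered pairs (u, u') whose elements lie in distinct blocks.
dits = {(u, v) for u, v in product(U, U)
        if block_of(u, partition) is not block_of(v, partition)}

# Logical entropy: the count of distinctions normalized by |U|^2.
h = len(dits) / len(U) ** 2
print(len(dits), h)  # 10 distinctions out of 16 ordered pairs: h = 0.625
```

Here 10 of the 16 ordered pairs are distinctions, so the logical entropy of this partition is 10/16 = 0.625.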
From a pre-publication review by the late Austrian economist, Don Lavoie, of George Mason University: "The book's radical re-interpretation of property and contract is, I think, among the most powerful critiques of mainstream economics ever developed. It undermines the neoclassical way of thinking about property by articulating a theory of inalienable rights, and constructs out of this perspective a "labor theory of property" which is as different from Marx's labor theory of value as it is from neoclassicism. It traces the roots of such ideas in some fascinating and largely forgotten strands of the history of economics. It draws attention to the question of "responsibility" which neoclassicism has utterly lost sight of. It is startlingly fresh in its overall approach, and unusually well written in its presentation. ... It constitutes a better case for its economic democracy viewpoint than anything else in the literature."
Liberal-contractarian philosophies of justice see the unjust systems of slavery and autocracy in the past as being based on coercion—whereas the social order in modern democratic market societies is based on consent and contract. However, the ‘best’ cases for slavery and autocracy in the past were consent-based contractarian arguments. Hence, our first task is to recover those ‘forgotten’ apologia for slavery and autocracy. To counter those consent-based arguments, the historical anti-slavery and democratic movements developed a theory of inalienable rights. Our second task is to recover that theory and to consider several other applications of the theory. Finally, the liberal theories of justice expounded by John Rawls and by Robert Nozick are briefly examined from this perspective.
After Marx, dissenting economics almost always used 'the labour theory' as a theory of value. This paper develops a modern treatment of the alternative labour theory of property that is essentially the property theoretic application of the juridical principle of responsibility: impute legal responsibility in accordance with who was in fact responsible. To understand descriptively how assets and liabilities are appropriated in normal production, a 'fundamental myth' needs to be cleared away, and then the market mechanism of appropriation can be understood. On the normative side, neoclassical theory represents marginal productivity theory as showing that (a metaphorical version of) the imputation principle is satisfied ('people get what they produce') in competitive enterprises. Since that shows the moral commitment of neoclassical economics to the imputation principle, the labour theory of property is presented here as the actual non-metaphorical application of the imputation principle to property appropriation. The property-theoretic analysis at the firm level shows how the neoclassical (and much heterodox) analysis in terms of 'distributive shares' wholly misframed the basic questions. Finally, the paper shows how the imputation principle (modernised labour theory of property) is systematically violated in the present wage labour system of renting persons. The paper can be seen as taking up the recent challenge posed by Donald Katzner for a dialogue between neoclassical and heterodox microeconomics.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
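The “two-draw” interpretation mentioned above can be checked numerically in the classical case. The following sketch (the six-element universe and its partition are assumptions for illustration) verifies that the normalized count of distinctions equals the probability that two independent equiprobable draws from the universe land in different blocks:

```python
U = list(range(6))
partition = [{0, 1, 2}, {3, 4}, {5}]

# Two-draw probability of a distinction: 1 minus the sum of squared block probabilities.
h_twodraw = 1 - sum((len(B) / len(U)) ** 2 for B in partition)

# Direct normalized count of distinctions (ordered pairs in different blocks).
def block_of(x):
    return next(B for B in partition if x in B)

dits = sum(1 for u in U for v in U if block_of(u) is not block_of(v))
h_count = dits / len(U) ** 2

print(h_twodraw, h_count)  # both equal 22/36
```

Both computations give 22/36, the probability that two draws are distinguished by the partition.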
This paper shows how the classical finite probability theory (with equiprobable outcomes) can be reinterpreted and recast as the quantum probability calculus of a pedagogical or toy model of quantum mechanics over sets (QM/sets). There have been several previous attempts to develop a quantum-like model with the base field of ℂ replaced by ℤ₂. Since there are no inner products on vector spaces over finite fields, the problem is to define the Dirac brackets and the probability calculus. The previous attempts all required the brackets to take values in ℤ₂. But the usual QM brackets <ψ|ϕ> give the "overlap" between states ψ and ϕ, so for subsets S,T⊆U, the natural definition is <S|T>=|S∩T| (taking values in the natural numbers). This allows QM/sets to be developed with a full probability calculus that turns out to be a non-commutative extension of classical Laplace-Boole finite probability theory. The pedagogical model is illustrated by giving simple treatments of the indeterminacy principle, the double-slit experiment, Bell's Theorem, and identical particles in QM/Sets. A more technical appendix explains the mathematics behind carrying some vector space structures between QM over ℂ and QM/Sets over ℤ₂.
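The bracket definition <S|T>=|S∩T| stated above is easy to experiment with. A minimal sketch, where the three-element universe and the particular state and event subsets are assumptions for illustration:

```python
U = {'a', 'b', 'c'}
S = {'a', 'b'}   # a "state" in QM/sets is a subset of U
T = {'b', 'c'}   # an "event" is also a subset of U

def bracket(X, Y):
    """<X|Y> = |X ∩ Y|: the overlap of two subsets, valued in the natural numbers."""
    return len(X & Y)

# Conditioning on the state S in the resulting probability calculus:
# Pr(T|S) = |T ∩ S| / |S| = <T|S> / <S|S>.
prob = bracket(T, S) / bracket(S, S)
print(bracket(S, T), prob)  # overlap 1, conditional probability 0.5
```

With equiprobable outcomes, this reduces to classical Laplace-Boole conditional probability, consistent with the claim that QM/sets extends that theory.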
Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary “propositional” logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms—which is reflected in the duality between quotient objects and subobjects throughout algebra. If “propositional” logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (join entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
A theory of property needs to give an account of the whole life-cycle of a property right: how it is initiated, transferred, and terminated. Economics has focused on the transfers in the market and has almost completely neglected the question of the initiation and termination of property in normal production and consumption (not in some original state or in the transition from common to private property). The institutional mechanism for the normal initiation and termination of property is an invisible-hand function of the market, the market mechanism of appropriation. Does this mechanism satisfy an appropriate normative principle? The standard normative juridical principle is to assign or impute legal responsibility according to de facto responsibility. It is given a historical tag of being "Lockean" but the basis is contemporary jurisprudence, not historical exegesis. Then the fundamental theorem of the property mechanism is proven, which shows that if "Hume's conditions" (no transfers without consent and all contracts fulfilled) are satisfied, then the market automatically satisfies the Lockean responsibility principle, i.e., "Hume implies Locke." As a major application, the results in their contrapositive form, "Not Locke implies Not Hume," are applied to a market economy based on the employment contract. It is shown that production based on the employment contract violates the Lockean principle (all who work in an employment enterprise are de facto responsible for the positive and negative results) and thus Hume's conditions must also be violated in the marketplace (de facto responsible human action cannot be transferred from one person to another—as is readily recognized when an employer and employee together commit a crime).
There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropies are obtained as the values of the measure, respectively, on the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships since they are values of a measure. The uniform transformation into the formulas for Shannon entropy is linear so it explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations--as an analogy or mnemonic--since Shannon entropy is not a measure on a given set. What is the logic that gives rise to logical information theory? Partitions are dual to subsets, and the logic of partitions was recently developed in a dual/parallel relationship to the Boolean logic of subsets. Boole developed logical probability theory as the normalized counting measure on subsets. Similarly the normalized counting measure on partitions is logical entropy--when the partitions are represented as the set of distinctions that is the complement to the equivalence relation for the partition. In this manner, logical information theory provides the set-theoretic and measure-theoretic foundations for information theory.
The Shannon theory is then derived by the transformation that replaces the counting of distinctions with the counting of the number of binary partitions it takes, on average, to make the same distinctions by uniquely encoding the distinct elements--which is why the Shannon theory perfectly dovetails into coding and communications theory.
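The measure-theoretic claims above (compound entropies as values of one measure on union, difference, and intersection of dit sets, satisfying the Venn relations exactly) can be verified directly. A small sketch with two assumed partitions of a four-element set, chosen for illustration:

```python
from itertools import product

U = list(range(4))
pi = [{0, 1}, {2, 3}]
sigma = [{0, 2}, {1, 3}]

def ditset(blocks):
    """The set of distinctions: ordered pairs whose elements lie in distinct blocks."""
    def block_of(x):
        return next(B for B in blocks if x in B)
    return {(u, v) for u, v in product(U, U) if block_of(u) is not block_of(v)}

def m(S):
    """Normalized counting measure on U x U."""
    return len(S) / len(U) ** 2

d_pi, d_sig = ditset(pi), ditset(sigma)
joint = m(d_pi | d_sig)    # joint entropy: measure of the union
cond = m(d_pi - d_sig)     # conditional entropy: measure of the difference
mutual = m(d_pi & d_sig)   # mutual information: measure of the intersection

# The Venn diagram (inclusion-exclusion) relation holds exactly.
assert abs(joint - (m(d_pi) + m(d_sig) - mutual)) < 1e-12
print(m(d_pi), m(d_sig), joint, cond, mutual)
```

Here each partition has logical entropy 0.5, the joint entropy is 0.75, and the conditional and mutual terms are each 0.25, so 0.75 = 0.5 + 0.5 − 0.25 as the measure guarantees.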
Dramatic changes or revolutions in a field of science are often made by outsiders or 'trespassers,' who are not limited by the established, 'expert' approaches. Each essay in this diverse collection shows the fruits of intellectual trespassing and poaching among fields such as economics, Kantian ethics, Platonic philosophy, category theory, double-entry accounting, arbitrage, algebraic logic, series-parallel duality, and financial arithmetic.
This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable, so they represent the pre-probability notion of information. Then logical entropy is a probability measure on the information sets, the probability that on two independent trials, a distinction or “dit” of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits—so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable. And, using a linearization method, all the set concepts in this logical information theory naturally extend to vector spaces in general—and to Hilbert spaces in particular—for quantum logical information theory, which provides the natural measure of the distinctions made in quantum measurement. Relatively short but dense in content, this work can serve as a reference for researchers and graduate students doing investigations in information theory, maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.
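The dit-to-bit transform described above is simple enough to state in code. A sketch using an assumed distribution (1/2, 1/4, 1/4), chosen only for illustration:

```python
from math import log2

# Block probabilities of a partition (or the distribution of a random variable).
p = [0.5, 0.25, 0.25]

# Logical entropy: the two-draw distinction probability, sum of p*(1-p).
h_logical = sum(pi * (1 - pi) for pi in p)

# The dit-to-bit transform replaces each (1 - pi) with log2(1/pi),
# yielding the Shannon entropy: the average number of binary distinctions.
H_shannon = sum(pi * log2(1 / pi) for pi in p)

print(h_logical, H_shannon)  # 0.625 and 1.5
```

The substitution (1 − p) → log2(1/p) is the non-linear re-quantification: the same distinctions are counted, but in units of average binary partitions (bits) rather than two-draw probabilities.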
Since its formal definition over sixty years ago, category theory has been increasingly recognized as having a foundational role in mathematics. It provides the conceptual lens to isolate and characterize the structures with importance and universality in mathematics. The notion of an adjunction (a pair of adjoint functors) has moved to center-stage as the principal lens. The central feature of an adjunction is what might be called “determination through universals” based on universal mapping properties. A recently developed “heteromorphic” theory about adjoints suggests a conceptual structure, albeit abstract and atemporal, for how new relatively autonomous behavior can emerge within a system obeying certain laws. The focus here is on applications in the life sciences (e.g., selectionist mechanisms) and human sciences (e.g., the generative grammar view of language).
There is a fault line running through classical liberalism as to whether or not democratic self-governance is a necessary part of a liberal social order. The democratic and non-democratic strains of classical liberalism are both present today—particularly in America. Many contemporary libertarians and neo-Austrian economists represent the non-democratic strain in their promotion of non-democratic sovereign city-states (startup cities or charter cities). We will take the late James M. Buchanan as a representative of the democratic strain of classical liberalism. Since the fundamental norm of classical liberalism is consent, we must start with the intellectual history of the voluntary slavery contract, the coverture marriage contract, and the voluntary non-democratic constitution (or pactum subjectionis). Next we recover the theory of inalienable rights that descends from the Reformation doctrine of the inalienability of conscience through the Enlightenment (e.g., Spinoza and Hutcheson) in the abolitionist and democratic movements. Consent-based governments divide into those based on the subjects' alienation of power to a sovereign and those based on the citizens' delegation of power to representatives. Inalienable rights theory rules out that alienation in favor of delegation, so the citizens remain the ultimate principals and the form of government is democratic. Thus the argument concludes in agreement with Buchanan that the classical liberal endorsement of sovereign individuals acting in the marketplace generalizes to the joint action of individuals as the principals in their own organizations.
Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of (closed) subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The notion of a partition (or quotient set or equivalence relation) is dual (in a category-theoretic sense) to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of partitions to the quantum logic of direct-sum decompositions (i.e., the vector space version of a set partition) of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the logic to express measurement by any self-adjoint operators rather than just the projection operators associated with subspaces. In this introductory paper, the focus is on the quantum logic of direct-sum decompositions of a finite-dimensional vector space (including such a Hilbert space). The primary special case examined is finite vector spaces over ℤ₂ where the pedagogical model of quantum mechanics over sets (QM/Sets) is formulated. In the Appendix, the combinatorics of direct-sum decompositions of finite vector spaces over GF(q) is analyzed with computations for the case of QM/Sets where q=2.
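As a toy check of the partition-like behavior of direct-sum decompositions over ℤ₂, the following sketch (the coordinate decomposition of ℤ₂² is an assumed example, not from the paper) verifies the defining property: every vector has exactly one representation as a sum of components, one drawn from each subspace:

```python
from itertools import product

# Vectors in Z_2^2 as tuples; addition is componentwise mod-2 (XOR).
def add(u, v):
    return tuple((a + b) % 2 for a, b in zip(u, v))

space = list(product([0, 1], repeat=2))
V1 = [(0, 0), (1, 0)]   # subspace spanned by (1, 0)
V2 = [(0, 0), (0, 1)]   # subspace spanned by (0, 1)

# Direct-sum decomposition: each vector w has a unique representation v1 + v2.
for w in space:
    reps = [(v1, v2) for v1 in V1 for v2 in V2 if add(v1, v2) == w]
    assert len(reps) == 1

print("unique decomposition verified for all", len(space), "vectors")
```

This uniqueness is the vector space analogue of a set partition's blocks covering the universe without overlap.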
Abstraction turns equivalence into identity, but there are two ways to do it. Given the equivalence relation of parallelness on lines, the #1 way to turn equivalence into identity by abstraction is to consider equivalence classes of parallel lines. The #2 way is to consider the abstract notion of the direction of parallel lines. This paper develops simple mathematical models of both types of abstraction and shows, for instance, how finite probability theory can be interpreted using #2 abstracts as “superposition events” in addition to the ordinary events. The goal is to use the second notion of abstraction to shed some light on the notion of an indefinite superposition in quantum mechanics.
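The two ways of abstracting can be mimicked computationally. A sketch using slope as a stand-in for direction (representing lines by slope and intercept is my illustrative assumption; the paper works with general equivalence relations):

```python
from fractions import Fraction

# Lines as (slope, intercept) pairs; parallel = same slope (verticals ignored).
lines = [(Fraction(1, 2), 0), (Fraction(1, 2), 3), (Fraction(2), 1)]

# Way #1: abstraction as equivalence classes of parallel lines.
classes = {}
for line in lines:
    classes.setdefault(line[0], set()).add(line)

# Way #2: abstraction as the "direction" itself, one abstract item per class.
directions = {slope for slope, _ in lines}

print(len(classes), sorted(directions))
```

Way #1 keeps the distinct parallel lines inside each class; way #2 collapses each class to a single abstract entity, which is the sense in which a #2 abstract is indefinite about which concrete instance it came from.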
There is some consensus among orthodox category theorists that the concept of adjoint functors is the most important concept contributed to mathematics by category theory. We give a heterodox treatment of adjoints using heteromorphisms that parses an adjunction into two separate parts. Then these separate parts can be recombined in a new way to define a cognate concept, the brain functor, to abstractly model the functions of perception and action of a brain. The treatment uses relatively simple category theory and is focused on the interpretation and application of the mathematical concepts.
The purpose of this paper is to show that the mathematics of quantum mechanics is the mathematics of set partitions linearized to vector spaces, particularly in Hilbert spaces. That is, the math of QM is the Hilbert space version of the math to describe objective indefiniteness that at the set level is the math of partitions. The key analytical concepts are definiteness versus indefiniteness, distinctions versus indistinctions, and distinguishability versus indistinguishability. The key machinery to go from indefinite to more definite states is the partition join operation at the set level that prefigures at the quantum level projective measurement as well as the formation of maximally-definite state descriptions by Dirac’s Complete Sets of Commuting Operators. This development is measured quantitatively by logical entropy at the set level and by quantum logical entropy at the quantum level. This follow-the-math approach supports the Literal Interpretation of QM—as advocated by Abner Shimony among others—which sees a reality of objective indefiniteness that is quite different from the common sense and classical view of reality as being “definite all the way down”.
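The partition join named above is a one-line set construction. A sketch with two assumed partitions of a four-element set, whose join turns out to be the fully definite (discrete) partition:

```python
U = set(range(4))
pi = [{0, 1}, {2, 3}]
sigma = [{0, 2}, {1, 3}]

# Join of two partitions: the nonempty intersections of their blocks.
# It makes every distinction made by either partition, so it is at least
# as refined (as definite) as each of them.
join = [B & C for B in pi for C in sigma if B & C]
print(sorted(map(sorted, join)))  # [[0], [1], [2], [3]]
```

Here the join is maximally refined: every element is in its own block, the set-level picture of a maximally definite state description.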
There is a fallacy that is often involved in the interpretation of quantum experiments involving a certain type of separation, such as the double-slit experiments, which-way interferometer experiments, polarization analyzer experiments, Stern-Gerlach experiments, and quantum eraser experiments. The fallacy leads not only to flawed textbook accounts of these experiments but to flawed inferences about retrocausality in the context of delayed-choice versions of separation experiments.
Just as the two sides in the Cold War agreed that Western Capitalism and Soviet Communism were "the" two alternatives, so the two sides in the intellectual Great Debate agreed on a common framing of questions with the defenders of capitalism taking one side and Marxists taking the other side of the questions. From the viewpoint of economic democracy (e.g., a labor-managed market economy), this late Great Debate between capitalism and socialism was as misframed as would be an antebellum 'Great Debate' between the private or public ownership of slaves. Even though the Great Debate between capitalism and socialism is now in the dustbin of intellectual history, Marxism still plays an important role in sustaining the misframing of the questions so that the defenders of the present employment system do not have to face the real questions that separate that system from a system of economic democracy. In that sense, Marxism has become the ultimate capitalist tool.
In her recent book Private Government, Elizabeth Anderson makes a powerful but pragmatic case against the abuses experienced by employees in conventional corporations. The purpose of this review-essay is to contrast Anderson’s pragmatic critique of many abuses in the employment relation with a principled critique of the employment relationship itself. This principled critique is based on the theory of inalienable rights that descends from the Reformation doctrine of the inalienability of conscience down through the Enlightenment in the abolitionist, democratic, and feminist movements. That theory was the basis for the abolition of the voluntary slavery or self-sale contract, the voluntary non-democratic constitution (pactum subjectionis), and the voluntary coverture marriage contract in today’s democratic countries. When understood in modern terms, that same theory applies as well against the voluntary self-rental or employment contract that is the basis for our current economic system.
Instead of the half-century old foundational feud between set theory and category theory, this paper argues that they are theories about two different complementary types of universals. The set-theoretic antinomies forced naïve set theory to be reformulated using some iterative notion of a set so that a set would always have higher type or rank than its members. Then the universal u_{F}={x|F(x)} for a property F() could never be self-predicative in the sense of u_{F}∈u_{F}. But the mathematical theory of categories, dating from the mid-twentieth century, includes a theory of always-self-predicative universals--which can be seen as forming the "other bookend" to the never-self-predicative universals of set theory. The self-predicative universals of category theory show that the problem in the antinomies was not self-predication per se, but negated self-predication. They also provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought.
Since the pioneering work of Birkhoff and von Neumann, quantum logic has been interpreted as the logic of subspaces of a Hilbert space. There is a progression from the usual Boolean logic of subsets to the "quantum logic" of subspaces of a general vector space--which is then specialized to the closed subspaces of a Hilbert space. But there is a "dual" progression. The set notion of a partition is dual to the notion of a subset. Hence the Boolean logic of subsets has a dual logic of partitions. Then the dual progression is from that logic of set partitions to the quantum logic of direct-sum decompositions of a general vector space--which can then be specialized to the direct-sum decompositions of a Hilbert space. This allows the quantum logic of direct-sum decompositions to express measurement by any self-adjoint operators. The quantum logic of direct-sum decompositions is dual to the usual quantum logic of subspaces in the same sense that the logic of partitions is dual to the usual Boolean logic of subsets.
Liberal thought (in the sense of classical liberalism) is based on the juxtaposition of consent to coercion. Autocracy and slavery were seen as based on coercion whereas today's political democracy and economic 'employment system' are based on consent to voluntary contracts. This paper retrieves an almost forgotten dark side of contractarian thought that based autocracy and slavery on explicit or implicit voluntary contracts. To answer these 'best case' arguments for slavery and autocracy, the democratic and abolitionist movements forged arguments not simply in favour of consent, but arguments that voluntary contracts to legally alienate aspects of personhood were invalid 'even with consent' – which made the underlying rights inherently inalienable. Once understood, those arguments have the perhaps 'unintended consequence' of making the neo-abolitionist case for ruling out today's self-rental contract, the employer-employee contract. The paper has to also retrieve these inalienable rights arguments since they have been largely lost on the Left, not to mention in liberal thought.
Classical liberalism is skeptical about governmental organizations "doing good" for people. Instead governments should create the conditions so that people individually (Adam Smith) and in associations (Tocqueville) are empowered to do good for themselves. The market implications of classical liberalism are well-known, but the implications for organizations are controversial. We will take James Buchanan as our guide (with assists from Mill and Dewey). Unpacking the implications of classical liberalism for the "science of associations" (Tocqueville) requires a tour through the intellectual history of the voluntary slavery contract and the voluntary non-democratic constitution. The argument concludes that the classical liberal endorsement of sovereign individuals acting in the marketplace generalizes to the joint action of individuals as the principals in their own organizations and associations.
In this chapter I seek to provide a theoretical defense of workplace democracy that is independent from and outside the lineage of Marxist and communist theory. Common to the council movements, anarcho-syndicalism and many other forms of libertarian socialism was the idea “that workers’ self-management was central.” Yet the idea of workers’ control has not been subject to the same theoretical development as Marx’s theory, not to mention capitalist economic theory. This chapter aims to contribute at a theoretical level by providing a justification and defense of self-managed workplaces that is independent of the particular historical tradition of the council movements. There is a clear and definitive case for workplace democracy based on first principles that descends to modern times through the Reformation and Enlightenment in the abolitionist, democratic and feminist movements. By the twentieth century, the arguments had been scattered and lost – like the bones of some ancient beast scattered in a desert – partly due to misconceptions, mental blocks and misinterpretations embodied in Marxism, liberalism and economic theory. When one has worked through some of these intellectual roadblocks, then one may be better able to reassemble the case for workplace democracy from well-known first principles developed in the abolitionist, democratic and feminist movements.
This paper shows how the universals of category theory in mathematics provide a model (in the Platonic Heaven of mathematics) for the self-predicative strand of Plato's Theory of Forms as well as for the idea of a "concrete universal" in Hegel and similar ideas of paradigmatic exemplars in ordinary thought. The paper also shows how the always-self-predicative universals of category theory provide the "opposite bookend" to the never-self-predicative universals of iterative set theory and thus that the paradoxes arose from having one theory (e.g., Frege's Paradise) where universals could be either self-predicative or non-self-predicative (instead of being always one or always the other).
This paper shows that implicit assumptions about the numeraire good in the Kaldor-Hicks efficiency-equity analysis involve a "same-yardstick" fallacy (a fallacy pointed out by Paul Samuelson in another context). These results have negative implications for cost-benefit analysis, the wealth-maximization approach to law and economics, and other parts of applied welfare economics--as well as for the whole vision of economics based on the "production and distribution of social wealth."
The purpose of this paper is to show that the dual notions of elements & distinctions are the basic analytical concepts needed to unpack and analyze morphisms, duality, and universal constructions in Sets, the category of sets and functions. The analysis extends directly to other concrete categories (groups, rings, vector spaces, etc.) where the objects are sets with a certain type of structure and the morphisms are functions that preserve that structure. Then the elements & distinctions-based definitions can be abstracted in a purely arrow-theoretic way for abstract category theory. In short, the language of elements & distinctions is the conceptual language in which the category of sets is written, and abstract category theory gives the abstract arrows version of those definitions.
Early democratic theorists such as Kant considered the effects of being a servant or, in modern terms, an employee to be so negative that such dependent people should be denied the vote. John Stuart Mill and John Dewey also noted the negative effects of the employment relation on the development of democratic habits and civic virtues but rather than deny the franchise to employees, they pushed for workplace democracy where workers would be members of their company rather than employees. In spite of the continuing prevalence of the employment relation and the lack of workplace democracy, this topic now seems to be something of a "third rail" in deliberative democratic theory.
Today it would be considered "bad Platonic metaphysics" to think that among all the concrete instances of a property there could be a universal instance so that all instances had the property by virtue of participating in that concrete universal. Yet there is a mathematical theory, category theory, dating from the mid-20th century that shows how to precisely model concrete universals within the "Platonic Heaven" of mathematics. This paper, written for the philosophical logician, develops this category-theoretic treatment of concrete universals along with a new concept to abstractly model the functions of a brain.
Evolutionary economics often focuses on the comparison between economic competition and the process of natural selection to select the fitter members of a given population. But that neglects the other "half" of an evolutionary process, the mechanism for the generation of new possibilities that is key to dynamic efficiency. My topic is the process of parallel experimentation which I take to be a process of multiple experiments running concurrently with some form of common goal, with some semi-isolation between the experiments, with benchmarking comparisons made between the experiments, and with the "migration" of discoveries between experiments wherever possible to ratchet up the performance of the group. The thesis is that parallel experimentation is a fundamental dynamic efficiency scheme to enhance and accelerate variation, innovation, and learning in contexts of genuine uncertainty or known ignorance. Within evolutionary biology, this type of parallel experimentation scheme was developed in Sewall Wright's shifting balance theory of evolution. It addressed the rather neglected topic of how a population on a low fitness peak might eventually be able to go "downhill" against selective pressures, traverse a valley of low fitness, and then ascend a higher fitness peak. The theme of parallel experimentation is used to recast and pull together dynamic and pluralistic theories in economics, political theory, philosophy of science, and social learning.
In the 1990s, a debate raged across the whole postsocialist world as well as in Western development agencies such as the World Bank about the best approach to the transition from various forms of socialism or communism to a market economy and political democracy. One of the most hotly contested topics was the question of the workplace being organized based on workplace democracy (e.g., various forms of worker ownership) or based on the conventional employer-employee relationship. Well before 1989, many of the socialist countries had started experimenting with various forms of "self-management" operating in more of a market setting, Yugoslavia being the most developed example. Thus one "path to the market" would ...
Saunders Mac Lane famously remarked that "Bourbaki just missed" formulating adjoints in a 1948 appendix (written no doubt by Pierre Samuel) to an early draft of Algèbre--which then had to wait until Daniel Kan's 1958 paper on adjoint functors. But Mac Lane was using the orthodox treatment of adjoints that only contemplates the object-to-object morphisms within a category, i.e., homomorphisms. When Samuel's treatment is reconsidered in view of the treatment of adjoints using heteromorphisms or hets (object-to-object morphisms between objects in different categories), then he, in effect, isolated the concept of a left representation solving a universal mapping problem. When dualized to obtain the concept of a right representation, the two halves only need to be united to obtain an adjunction. Thus Samuel was only a now-simple dualization away from formulating adjoints in 1948. Apparently, Bodo Pareigis' 1970 text was the first and perhaps only text to give the heterodox "new characterization" (i.e., heteromorphic treatment) of adjoints. Orthodox category theory uses various relatively artificial devices to avoid formally recognizing hets--even though hets are routinely used by the working mathematician. Finally, we consider a "philosophical" question as to whether the most important concept in category theory is the notion of an adjunction or the notion of a representation giving a universal mapping property (where adjunctions arise as the special case of a bi-representation of dual universal mapping problems).
Recent developments in pure mathematics and in mathematical logic have uncovered a fundamental duality between "existence" and "information." In logic, the duality is between the Boolean logic of subsets and the logic of quotient sets, equivalence relations, or partitions. The analogue to an element of a subset is the notion of a distinction of a partition, and that leads to a whole stream of dualities or analogies--including the development of new logical foundations for information theory parallel to Boole's development of logical finite probability theory. After outlining these dual concepts in mathematical terms, we turn to a more metaphysical speculation about two dual notions of reality: a fully definite notion using Boolean logic and appropriate for classical physics, and an objectively indefinite notion using partition logic which turns out to be appropriate for quantum mechanics. The existence-information duality is used to intuitively illustrate these two dual notions of reality. The elucidation of the objectively indefinite notion of reality leads to the "killer application" of the existence-information duality, namely the interpretation of quantum mechanics.
Nancy MacLean’s book, Democracy in Chains, raised questions about James M. Buchanan’s commitment to democracy. This paper investigates the relationship of classical liberalism in general and of Buchanan in particular to democratic theory. Contrary to the simplistic classical liberal juxtaposition of “coercion vs. consent,” there have been from Antiquity onwards voluntary contractarian defenses of non-democratic government and even slavery—all little noticed by classical liberal scholars who prefer to think of democracy as just “government by the consent of the governed” and slavery as being inherently coercive. Historically, democratic theory had to go beyond that simplistic notion of democracy to develop a critique of consent-based non-democratic government, e.g., the Hobbesian pactum subjectionis. That critique was based firstly on the distinction between contracts or constitutions of alienation (translatio) versus delegation (concessio). Then the contracts of alienation were ruled out based on the theory of inalienable rights that descends from the Reformation doctrine of inalienability of conscience down through the Enlightenment to modern times in the abolitionist and democratic movements. While he developed no theory of inalienability, the mature Buchanan explicitly allowed only a constitution of delegation, contrary to many modern classical liberals or libertarians who consider the choice between consent-based democratic or non-democratic governments (e.g., private cities or shareholder states) to be a pragmatic one. But Buchanan seems to not even realize that his at-most-delegation dictum would also rule out the employer-employee or human rental contract which is a contract of alienation “within the scope of the employment.”
In finite probability theory, events are subsets S⊆U of the outcome set. Subsets can be represented by 1-dimensional column vectors. By extending the representation of events to two-dimensional matrices, we can introduce "superposition events." Probabilities are introduced for classical events, superposition events, and their mixtures by using density matrices. Then probabilities for experiments or `measurements' of all these events can be determined in a manner exactly like in quantum mechanics (QM) using density matrices. Moreover, the transformation of the density matrices induced by the experiments or `measurements' is the Lüders mixture operation as in QM. And finally by moving the machinery into the n-dimensional vector space over ℤ₂, different basis sets become different outcome sets. That `non-commutative' extension of finite probability theory yields the pedagogical model of quantum mechanics over ℤ₂ that can model many characteristic non-classical results of QM.
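As a rough illustration of this setup, the sketch below (in Python with NumPy; the variable names are illustrative, not from the paper) represents a classical event and a "superposition event" as density matrices over a four-element outcome set, and applies the Lüders mixture operation for a standard-basis measurement:

```python
import numpy as np

n = 4        # outcome set U = {0, 1, 2, 3}
S = [0, 2]   # an event S ⊆ U

# Classical event: diagonal density matrix, equiprobable outcomes in S
rho_classical = np.zeros((n, n))
for i in S:
    rho_classical[i, i] = 1 / len(S)

# "Superposition event": rank-one projection onto the 0/1 indicator of S
chi = np.zeros(n)
chi[S] = 1.0
rho_super = np.outer(chi, chi) / len(S)

# Both are trace-1 density matrices with the same outcome probabilities
# (the diagonal); they differ only in the off-diagonal "coherences".
assert np.isclose(np.trace(rho_super), 1.0)
assert np.allclose(np.diag(rho_super), np.diag(rho_classical))

# The Lüders mixture for a standard-basis measurement,
# rho -> sum_i P_i rho P_i, zeroes the off-diagonal entries:
lueders = np.diag(np.diag(rho_super))
assert np.allclose(lueders, rho_classical)
```

Measuring the superposition event thus returns exactly the classical event's density matrix, which is the decoherence-style behavior the abstract points to.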
Double-entry bookkeeping (DEB) implicitly uses a specific mathematical construction, the group of differences using pairs of unsigned numbers ("T-accounts"). That construction was only formulated abstractly in mathematics in the 19th century—even though DEB had been used in the business world for over five centuries. Yet the connection between DEB and the group of differences (here called the "Pacioli group") is still largely unknown both in mathematics and accounting. The precise mathematical treatment of DEB allows clarity on certain conceptual questions and it immediately yields the generalization of the double-entry method to multi-dimensional vectors typically representing the different types of property involved in an enterprise or household.
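A minimal sketch of the group-of-differences construction, assuming only what the abstract states (the class name `TAccount` and method names are illustrative):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TAccount:
    """A T-account: a pair (debit, credit) of unsigned numbers.
    Two T-accounts are equivalent when their signed balances agree."""
    debit: float
    credit: float

    def __add__(self, other):
        # Group operation: component-wise addition of the pairs
        return TAccount(self.debit + other.debit, self.credit + other.credit)

    def balance(self):
        return self.debit - self.credit

    def equivalent(self, other):
        # (a, b) ~ (c, d)  iff  a + d == b + c, i.e. a - b == c - d
        return self.debit + other.credit == self.credit + other.debit

# The inverse of (a, b) in the group of differences is (b, a):
a = TAccount(100, 30)
assert (a + TAccount(30, 100)).equivalent(TAccount(0, 0))
```

The inverse-by-swapping step is how additive inverses are obtained from unsigned numbers, which is what lets DEB avoid negative numbers entirely.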
This paper presents an argument for the democratic (or 'labor-managed') firm based on ordinary jurisprudence. The standard principle of responsibility in jurisprudence ('Assign legal responsibility in accordance with de facto responsibility') implies that the people working in a firm should legally appropriate the assets and liabilities produced in the firm (the positive and negative fruits of their labor). This appropriation is normally violated due to the employment or self-rental contract. However, we present an inalienable rights argument, descending from the Reformation and Enlightenment, that the self-rental contract, like the self-sale or voluntary slavery contract, is inherently invalid. The key intuition of the inalienable rights theory is that one cannot in fact voluntarily transfer de facto responsibility for one's actions to another person. One can only voluntarily co-operate with another person, but then one is de facto jointly responsible for the results. Just as the legal authorities legally reconstruct the criminous employer and employee as a partnership with shared responsibility, so justice demands that every firm be legally reconstructed as a partnership of all who work (working employers and employees) in the enterprise, i.e., as a democratic firm.
The lattice operations of join and meet were defined for set partitions in the nineteenth century, but no new logical operations on partitions were defined and studied during the twentieth century. Yet there is a simple and natural graph-theoretic method presented here to define any n-ary Boolean operation on partitions. An equivalent closure-theoretic method is also defined. In closing, the question is addressed of why it took so long for all Boolean operations to be defined for partitions.
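For reference, the two nineteenth-century lattice operations mentioned in the first sentence can be computed directly; a sketch in Python follows (function names are illustrative, and the paper's graph-theoretic construction of arbitrary n-ary Boolean operations is not reproduced here):

```python
from itertools import product

def common_refinement(pi, sigma):
    """Blocks are the nonempty intersections B ∩ C of blocks from each partition."""
    return [sorted(set(B) & set(C)) for B, C in product(pi, sigma) if set(B) & set(C)]

def coarsest_common_coarsening(pi, sigma):
    """Union-find over the graph linking elements that share a block in pi or
    sigma; the blocks are the connected components of that graph."""
    parent = {}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    def union(x, y):
        parent.setdefault(x, x)
        parent.setdefault(y, y)
        parent[find(x)] = find(y)
    for block in list(pi) + list(sigma):
        for u in block:
            union(block[0], u)
    comps = {}
    for x in parent:
        comps.setdefault(find(x), []).append(x)
    return [sorted(c) for c in comps.values()]

pi = [[1, 2], [3, 4]]
sigma = [[1], [2, 3], [4]]
assert sorted(common_refinement(pi, sigma)) == [[1], [2], [3], [4]]
assert coarsest_common_coarsening(pi, sigma) == [[1, 2, 3, 4]]
```

Which of the two counts as "join" and which as "meet" depends on the chosen refinement ordering of the partition lattice.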
The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of subsets so there is a dual concept of logical entropy which is the normalized counting measure on distinctions of partitions. Thus the logical notion of information is a measure of distinctions. Classical logical entropy naturally extends to the notion of quantum logical entropy which provides a more natural and informative alternative to the usual von Neumann entropy in quantum information theory. The quantum logical entropy of a post-measurement density matrix has the simple interpretation as the probability that two independent measurements of the same state using the same observable will have different results. The main result of the paper is that the increase in quantum logical entropy due to a projective measurement of a pure state is the sum of the absolute squares of the off-diagonal entries ("coherences") of the pure state density matrix that are zeroed ("decohered") by the measurement, i.e., the measure of the distinctions ("decoherences") created by the measurement.
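The normalized counting measure on distinctions described here is simple enough to compute directly; a minimal sketch (function names are illustrative):

```python
from itertools import product

def distinctions(partition):
    """dit(pi): ordered pairs (u, u') whose elements lie in different blocks."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    U = list(block_of)
    return [(u, v) for u, v in product(U, U) if block_of[u] != block_of[v]]

def logical_entropy(partition):
    """h(pi) = |dit(pi)| / |U|^2, the normalized counting measure on distinctions."""
    n = sum(len(block) for block in partition)
    return len(distinctions(partition)) / n ** 2

pi = [[1, 2], [3], [4]]  # a partition of U = {1, 2, 3, 4}
# The direct count of distinctions agrees with the block-probability
# formula h(pi) = 1 - sum_B (|B| / |U|)^2:
assert logical_entropy(pi) == 1 - sum((len(B) / 4) ** 2 for B in pi)
assert logical_entropy(pi) == 10 / 16
```

The equivalent form 1 − Σ p_B² is also the probability that two independent equiprobable draws from U land in different blocks, matching the two-measurement interpretation given for the quantum case.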
This article is a review of Erik Olin Wright’s 2010 book Envisioning Real Utopias. The review focuses on certain topics such as his understanding of ‘capitalism,’ his conception of worker cooperatives, and the general issues surrounding markets, the Left, and Marxism.
John Tomasi's new book, Free Market Fairness, has been well-received as "one of the very best philosophical treatments of libertarian thought, ever" and as a "long and friendly conversation between Friedrich Hayek and John Rawls—a conversation which, astonishingly, reaches agreement". The book does present an authoritative state-of-the-debate across the spectrum from right-libertarianism on the one side to high liberalism on the other side. My point is not to question where Tomasi comes down with his own version of "market democracy" as a remix of Hayek and Rawls. My point is to use his sympathetic restatements of views across the liberal spectrum to show the basic misframings and common misunderstandings that cut across the liberal-libertarian viewpoints surveyed in the book. As usual, the heart of the debate is not in the answers to carefully framed questions, but in the framing itself.