The need for quantitative measurement represents a unifying bond that links the physical, biological, and social sciences. Measurements of such disparate phenomena as subatomic masses, uncertainty, information, and human values share common features whose explication is central to foundational work in any particular mathematical science, as well as to the development of a coherent philosophy of science. This book presents a theory of measurement, one that is "abstract" in that it is concerned with highly general axiomatizations of empirical and qualitative settings and with how these can be represented quantitatively. It was inspired by, and represents a generalization and extension of, the last major research work in this field, Foundations of Measurement, Vol. I, by Krantz, Luce, Suppes, and Tversky, published in 1971.
Utilitarianism is one of the most famous ethical doctrines, based on the ideal of maximizing pleasure and minimizing pain. But Utilitarians and their opponents lack a clear scientific and philosophical understanding of its foundations, the measurement and aggregation of utility. This is what The Pursuit of Happiness now offers.
We revisit classical Utilitarianism by connecting and generalizing two ideas. The first is that a representation theorem is possible for hedonic value similar to, but also importantly different from, the one provided by von Neumann and Morgenstern to measure decision utility. The idea is to use objective time, in place of objective chance, to measure hedonic value. This representation for hedonic value delivers a stronger kind of scale than von Neumann–Morgenstern utility: a ratio scale rather than merely an interval scale. The second idea is that measurement on a ratio scale allows the meaningful aggregation of utilities over a group. This is aggregation by product rather than by sum. Aggregation by product is known to have interesting Prioritarian consequences. Aggregation becomes complicated when the two approaches are mixed, that is, when hedonic value is combined with uncertainty. It becomes problematic when pain as well as pleasure is taken into account.
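The contrast between aggregation by sum and aggregation by product, and the role of the ratio scale, can be illustrated numerically. This is a minimal sketch with hypothetical utility values; the function names are illustrative, not from the book:

```python
import math

def sum_aggregate(utilities):
    return sum(utilities)

def product_aggregate(utilities):
    return math.prod(utilities)

# Two hypothetical distributions with the same total utility:
equal   = [5.0, 5.0]   # evenly distributed over two people
unequal = [9.0, 1.0]   # concentrated on one person

# Sum aggregation is indifferent between them; product aggregation
# favors the even distribution -- the Prioritarian flavor of the product rule.
assert sum_aggregate(equal) == sum_aggregate(unequal)
assert product_aggregate(equal) > product_aggregate(unequal)

# Product aggregation is meaningful on a ratio scale: rescaling everyone's
# utility by a common positive factor preserves the comparison between groups.
k = 3.0
assert (product_aggregate([k * u for u in equal])
        > product_aggregate([k * u for u in unequal]))
```

Note that the same rescaling invariance fails for an interval scale (where an arbitrary additive constant is also allowed), which is why the stronger ratio-scale representation matters for group aggregation.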
Axiomatizations of measurement systems usually require an axiom--called an Archimedean axiom--that allows quantities to be compared. This type of axiom has a different form from the other measurement axioms and cannot--except in the most trivial cases--be empirically verified. In this paper, representation theorems for extensive measurement structures without Archimedean axioms are given. Such structures are represented in measurement spaces that are generalizations of the real number system. Furthermore, a precise description of "Archimedean axioms" is given, and it is shown that in all interesting cases "Archimedean axioms" are independent of the other measurement axioms.
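For concreteness, here is the standard form such an axiom takes in a positive extensive structure $\langle A, \succsim, \circ \rangle$; this is the textbook formulation, supplied as background rather than drawn from the paper itself:

```latex
\text{Archimedean axiom: for all } a, b \in A \text{ there exists } n \in \mathbb{N}
\text{ such that } na \succsim b,
\qquad \text{where } na = \underbrace{a \circ a \circ \cdots \circ a}_{n \text{ times}}.
```

The existential quantifier ranging over the natural numbers is what gives the axiom its different character: no finite set of observations can establish that such an $n$ exists, which is why it resists empirical verification.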
The evolution of color categorization systems is investigated by simulating categorization games played by a population of artificial agents. The constraints placed on individual agents' perception and cognition are minimal, involving only limited color discriminability and learning through reinforcement. The main dynamic mechanism for population evolution is pragmatic in nature: there is a pragmatic need for communication between agents, and if the results of such communications have positive consequences in their shared world, then the agents involved are positively rewarded, whereas if the results have negative consequences, then the agents involved are punished. A mechanism for changing the composition of the population through agents' birth and death is also investigated. This birth-death mechanism is found to move an established population color naming system effectively toward a theoretically more optimal one. The simulation results of this article provide insights regarding mechanisms that may constrain the universal tendencies in human color categorization systems observed in the linguistic and anthropological literatures.
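A reinforcement dynamic of the kind described can be sketched as follows. This is a minimal toy model: the number of colors and terms, the reward and punishment values, and the discriminability threshold are all assumed for illustration, not the article's actual settings:

```python
import random

random.seed(0)

N_COLORS = 10   # discretized color stimuli (illustrative choice)
N_TERMS = 3     # available color terms (illustrative choice)
ROUNDS = 5000

def new_agent():
    # weight[c][t]: propensity to use term t for color c (urn-style learning)
    return [[1.0] * N_TERMS for _ in range(N_COLORS)]

def choose(weights):
    # sample an index with probability proportional to its weight
    r = random.uniform(0.0, sum(weights))
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

def play_round(speaker, hearer):
    target = random.randrange(N_COLORS)
    term = choose(speaker[target])
    # the hearer interprets the term as the color it associates most strongly with it
    guess = max(range(N_COLORS), key=lambda c: hearer[c][term])
    # limited discriminability: a near-enough guess counts as communicative success
    success = abs(guess - target) <= 1
    delta = 0.5 if success else -0.25   # reward on success, punishment on failure
    speaker[target][term] = max(0.1, speaker[target][term] + delta)
    hearer[guess][term] = max(0.1, hearer[guess][term] + delta)
    return success

agents = [new_agent() for _ in range(4)]
successes = sum(play_round(*random.sample(agents, 2)) for _ in range(ROUNDS))
```

Over repeated rounds the reinforcement updates push the population toward shared term-to-color conventions; the article's birth-death mechanism, not modeled here, would additionally replace agents over time.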
Various proposals for generalizing the event spaces of probability functions have been put forth in the mathematical, scientific, and philosophical literatures. In cognitive psychology, such generalizations are used to explain puzzling results in decision theory and to model the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
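Why a non-Boolean event space changes the available probability concepts can be seen in miniature with open sets. The following is a toy illustration on an assumed three-point topology; the construction is standard lattice-theoretic background, not drawn from the commentary itself:

```python
# Open sets of a topology form a non-Boolean event space: they are closed
# under union and intersection but not under set complement.
X = frozenset({0, 1, 2})
opens = {frozenset(), frozenset({0}), frozenset({0, 1}), X}

def interior(s):
    # the largest open set contained in s
    return max((o for o in opens if o <= s), key=len)

A = frozenset({0, 1})
set_complement = X - A               # {2}, which is NOT an open set
assert set_complement not in opens
# so "not A" must be reinterpreted, e.g. as the interior of the complement:
not_A = interior(set_complement)     # interior of {2} is the empty set
assert not_A == frozenset()
# A together with not_A fails to cover X: excluded middle fails, and with it
# the complementation bookkeeping that classical probability takes for granted.
assert A | not_A != X
```

An analogous failure occurs for the closed subspaces of a Hilbert space, where distributivity rather than complementation breaks down; in both cases the probability concepts supported by the event space differ from the classical ones.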
In interactive situations, agents can "learn" something that is not a preexisting truth. They can converge to an arbitrary convention, or tacit agreement. Once such a convention is established, they may even view it as an objective truth. Here we investigate accommodation dynamics for interpersonal comparisons of utility intervals. We show, for a large class of dynamics, convergence to a convention.
In the literature, there are many axiomatizations of qualitative probability. They all suffer certain defects: either they are too nonspecific and allow nonunique quantitative interpretations, or they are overspecific and rule out cases with unique quantitative interpretations. In this paper, it is shown that the class of qualitative probability structures with nonunique quantitative interpretations is not first-order axiomatizable, and that the class of qualitative probability structures with a unique quantitative interpretation is not a finite, first-order extension of the theory of qualitative probability. The idea behind the method of proof is quite general and can be used in other measurement situations.
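For background, a qualitative probability structure in the classical (de Finetti) sense is a relation $\succsim$ ("at least as probable as") on an algebra of events $\mathcal{E}$ over $\Omega$ satisfying the following; this is the standard formulation from the measurement literature, not a quotation from the paper:

```latex
\begin{aligned}
&\text{(A1)}\quad \succsim \text{ is a total preorder on } \mathcal{E};\\
&\text{(A2)}\quad A \succsim \varnothing \ \text{ for all } A \in \mathcal{E}, \quad \Omega \succ \varnothing;\\
&\text{(A3)}\quad \text{if } A \cap C = B \cap C = \varnothing, \text{ then }
  A \succsim B \iff A \cup C \succsim B \cup C.
\end{aligned}
```

A quantitative interpretation is then a probability measure $P$ with $A \succsim B$ iff $P(A) \ge P(B)$; the paper's results concern when such a $P$ exists uniquely and whether the relevant classes of structures can be captured first-order.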
This paper takes a critical look at theory-free, statistical methodologies for processing and interpreting data taken from respondents answering a set of dichotomous (yes-no) questions. The basic issue concerns the extent to which theoretical conclusions based on such analyses are invariant under a class of "informationally equivalent" question transformations. First, the notion of Boolean equivalence of two question sets is discussed. Then Lazarsfeld's latent structure analysis is considered in detail. It is found that the best-fitting latent model depends on which one of the many informationally equivalent question sets is used. This fact raises a number of methodological problems and pitfalls for latent structure analysis. Related problems with other methodologies are briefly discussed.
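The notion of informationally equivalent question sets can be sketched concretely. This is a hypothetical two-question illustration, assuming one simple Boolean transformation (XOR), not an example from the paper:

```python
from itertools import product

# Two dichotomous question sets are informationally (Boolean) equivalent when
# they induce the same partition of possible response worlds.
worlds = list(product([0, 1], repeat=2))   # truth values of (p, q)

original    = [lambda w: w[0], lambda w: w[1]]         # ask p, ask q
transformed = [lambda w: w[0] ^ w[1], lambda w: w[1]]  # ask p XOR q, ask q

def partition(questions):
    # group worlds by their answer pattern under the given question set
    cells = {}
    for w in worlds:
        cells.setdefault(tuple(q(w) for q in questions), set()).add(w)
    return {frozenset(cell) for cell in cells.values()}

# Same information content despite different question wording:
assert partition(original) == partition(transformed)
```

The paper's point is that although such question sets carry identical information, a theory-free procedure such as latent structure analysis can fit different "best" models depending on which of them is administered.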
A probability function on an algebra of events is assumed. Some of the events are scientific refutations in the sense that the assumption of their occurrence leads to a contradiction. It is shown that the scientific refutations form a Boolean sublattice in terms of the subset ordering. In general, the restriction of the probability function to this sublattice is not itself a probability function on the sublattice. It does, however, have many interesting properties. In particular, (i) it captures probabilistic ideas inherent in some legal procedures; and (ii) it is used to argue against the commonly held view that behavioral violations of certain basic conditions for qualitative probability are indicative of irrationality. Also discussed are (iii) the relationship between the formal development of scientific refutations presented here and intuitionistic logic, and (iv) an interpretation of a belief function used in the behavioral sciences to explain empirical results about subjective, probabilistic estimation, including the Ellsberg paradox.