Suppose several individuals (e.g., experts on a panel) each assign probabilities to some events. How can these individual probability assignments be aggregated into a single collective probability assignment? This article reviews several proposed solutions to this problem. We focus on three salient proposals: linear pooling (the weighted or unweighted linear averaging of probabilities), geometric pooling (the weighted or unweighted geometric averaging of probabilities), and multiplicative pooling (where probabilities are multiplied rather than averaged). We present axiomatic characterisations of each class of pooling functions (most of them classic, but one new) and argue that linear pooling can be justified procedurally, but not epistemically, while the other two pooling methods can be justified epistemically. The choice between them, in turn, depends on whether the individuals' probability assignments are based on shared information or on private information. We conclude by mentioning a number of other pooling methods.
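For orientation, the three pooling methods can be written out as formulas. The following is a minimal sketch for a finite set of worlds ω, individual probability functions P_1, …, P_n, and weights w_i ≥ 0 summing to 1; multiplicative pooling is shown in its simplest form, without the calibrating function that the full characterisation allows.

```latex
% Linear pooling: weighted arithmetic average, event-wise
P_{\mathrm{lin}}(A) = \sum_{i=1}^{n} w_i \, P_i(A)

% Geometric pooling: weighted geometric average over worlds, renormalised
P_{\mathrm{geo}}(\omega) = \frac{\prod_{i=1}^{n} P_i(\omega)^{w_i}}{\sum_{\omega'} \prod_{i=1}^{n} P_i(\omega')^{w_i}}

% Multiplicative pooling (simplest form): multiply and renormalise
P_{\mathrm{mult}}(\omega) = \frac{\prod_{i=1}^{n} P_i(\omega)}{\sum_{\omega'} \prod_{i=1}^{n} P_i(\omega')}
```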
Behaviourism is the view that preferences, beliefs, and other mental states in social-scientific theories are nothing but constructs re-describing people's behaviour. Mentalism is the view that they capture real phenomena, on a par with the unobservables in science, such as electrons and electromagnetic fields. While behaviourism has gone out of fashion in psychology, it remains influential in economics, especially in ‘revealed preference’ theory. We defend mentalism in economics, construed as a positive science, and show that it best fits scientific practice. We distinguish mentalism from, and reject, the radical neuroeconomic view that behaviour should be explained in terms of brain processes, as distinct from mental states.
We present a new “reason-based” approach to the formal representation of moral theories, drawing on recent decision-theoretic work. We show that any moral theory within a very large class can be represented in terms of two parameters: a specification of which properties of the objects of moral choice matter in any given context, and a specification of how these properties matter. Reason-based representations provide a very general taxonomy of moral theories, as differences among theories can be attributed to differences in their two key parameters. We can thus formalize several distinctions, such as between consequentialist and non-consequentialist theories, between universalist and relativist theories, between agent-neutral and agent-relative theories, between monistic and pluralistic theories, between atomistic and holistic theories, and between theories with a teleological structure and those without. Reason-based representations also shed light on an important but under-appreciated phenomenon: the “underdetermination of moral theory by deontic content”.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow’s theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow’s theorem, our main aim is to identify the analogue of Arrow’s theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
There is a surprising disconnect between formal rational choice theory and philosophical work on reasons. The one is silent on the role of reasons in rational choices, the other rarely engages with the formal models of decision problems used by social scientists. To bridge this gap, we propose a new, reason-based theory of rational choice. At its core is an account of preference formation, according to which an agent’s preferences are determined by his or her motivating reasons, together with a ‘weighing relation’ between different combinations of reasons. By explaining how someone’s preferences may vary with changes in his or her motivating reasons, our theory illuminates the relationship between deliberation about reasons and rational choices. Although primarily positive, the theory can also help us think about how those preferences and choices ought to respond to normative reasons.
While ordinary decision theory focuses on empirical uncertainty, real decision-makers also face normative uncertainty: uncertainty about value itself. From a purely formal perspective, normative uncertainty is comparable to (Harsanyian or Rawlsian) identity uncertainty in the 'original position', where one's future values are unknown. A comprehensive decision theory must address twofold uncertainty -- normative and empirical. We present a simple model of twofold uncertainty, and show that the most popular decision principle -- maximising expected value ('Expectationalism') -- has different formulations, namely Ex-Ante Expectationalism, Ex-Post Expectationalism, and hybrid theories. These alternative theories recommend different decisions, reasoning modes, and attitudes to risk. But they converge under an interesting (necessary and sufficient) condition.
We introduce a “reason-based” framework for explaining and predicting individual choices. It captures the idea that a decision-maker focuses on some but not all properties of the options and chooses an option whose motivationally salient properties he/she most prefers. Reason-based explanations allow us to distinguish between two kinds of context-dependent choice: the motivationally salient properties may (i) vary across choice contexts, and (ii) include not only “intrinsic” properties of the options, but also “context-related” properties. Our framework can accommodate boundedly rational and sophisticatedly rational choice. Since properties can be recombined in new ways, it also offers resources for predicting choices in unobserved contexts.
The aggregation of individual judgments over interrelated propositions is a newly arising field of social choice theory. I introduce several independence conditions on judgment aggregation rules, each of which protects against a specific type of manipulation by agenda setters or voters. I derive impossibility theorems whereby these independence conditions are incompatible with certain minimal requirements. Unlike earlier impossibility results, the main result here holds for any (non-trivial) agenda. However, independence conditions arguably undermine the logical structure of judgment aggregation. I therefore suggest restricting independence to premises, which leads to a generalised premise-based procedure. This procedure is proven to be possible if the premises are logically independent.
The contemporary theory of epistemic democracy often draws on the Condorcet Jury Theorem to formally justify the ‘wisdom of crowds’. But this theorem is inapplicable in its current form, since one of its premises – voter independence – is notoriously violated. This premise carries responsibility for the theorem's misleading conclusion that ‘large crowds are infallible’. We prove a more useful jury theorem: under defensible premises, ‘large crowds are fallible but better than small groups’. This theorem rehabilitates the importance of deliberation and education, which appear inessential in the classical jury framework. Our theorem is related to Ladha's (1993) seminal jury theorem for interchangeable (‘indistinguishable’) voters based on de Finetti's Theorem. We also prove a more general and simpler such jury theorem.
The new field of judgment aggregation aims to merge many individual sets of judgments on logically interconnected propositions into a single collective set of judgments on these propositions. Judgment aggregation has commonly been studied using classical propositional logic, with limited expressive power and a problematic representation of conditional statements ("if P then Q") as material conditionals. In this methodological paper, I present a simple unified model of judgment aggregation in general logics. I show how many realistic decision problems can be represented in it. This includes decision problems expressed in languages of classical propositional logic, predicate logic (e.g. preference aggregation problems), modal or conditional logics, and some multi-valued or fuzzy logics. I provide a list of simple tools for working with general logics, and I prove impossibility results that generalise earlier theorems.
Which rules for aggregating judgments on logically connected propositions are manipulable and which not? In this paper, we introduce a preference-free concept of non-manipulability and contrast it with a preference-theoretic concept of strategy-proofness. We characterize all non-manipulable and all strategy-proof judgment aggregation rules and prove an impossibility theorem similar to the Gibbard--Satterthwaite theorem. We also discuss weaker forms of non-manipulability and strategy-proofness. Comparing two frequently discussed aggregation rules, we show that “conclusion-based voting” is less vulnerable to manipulation than “premise-based voting”, which is strategy-proof only for “reason-oriented” individuals. Surprisingly, for “outcome-oriented” individuals, the two rules are strategically equivalent, generating identical judgments in equilibrium. Our results introduce game-theoretic considerations into judgment aggregation and have implications for debates on deliberative democracy.
Agents are often assumed to have degrees of belief (“credences”) and also binary beliefs (“beliefs simpliciter”). How are these related to each other? A much-discussed answer asserts that it is rational to believe a proposition if and only if one has a high enough degree of belief in it. But this answer runs into the “lottery paradox”: the set of believed propositions may violate the key rationality conditions of consistency and deductive closure. In earlier work, we showed that this problem generalizes: there exists no local function from degrees of belief to binary beliefs that satisfies some minimal conditions of rationality and non-triviality. “Locality” means that the binary belief in each proposition depends only on the degree of belief in that proposition, not on the degrees of belief in others. One might think that the impossibility can be avoided by dropping the assumption that binary beliefs are a function of degrees of belief. We prove that, even if we drop the “functionality” restriction, there still exists no local relation between degrees of belief and binary beliefs that satisfies some minimal conditions. Thus functionality is not the source of the impossibility; its source is the condition of locality. If there is any non-trivial relation between degrees of belief and binary beliefs at all, it must be a “holistic” one. We explore several concrete forms this “holistic” relation could take.
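The lottery paradox driving this result can be reproduced in a few lines. A toy sketch (not the paper's model): with a fair 100-ticket lottery and a belief threshold of 0.9, the threshold rule licenses believing of each ticket that it loses, alongside believing that some ticket wins, which is a jointly inconsistent belief set.

```python
# Toy illustration of the lottery paradox for a threshold-based
# belief-binarization rule (numbers are illustrative assumptions).

n_tickets = 100
threshold = 0.9  # believe a proposition iff its credence >= threshold

credence_ticket_i_loses = 1 - 1 / n_tickets  # 0.99, same for every ticket i
credence_some_ticket_wins = 1.0              # certainty

believes_each_ticket_loses = credence_ticket_i_loses >= threshold  # True
believes_some_ticket_wins = credence_some_ticket_wins >= threshold  # True

# Believing "ticket i loses" for all 100 tickets, together with
# "some ticket wins", is logically inconsistent.
print(believes_each_ticket_loses, believes_some_ticket_wins)  # True True
```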
John Broome has developed an account of rationality and reasoning which gives philosophical foundations for choice theory and the psychology of rational agents. We formalize his account into a model that differs from ordinary choice-theoretic models through focusing on psychology and the reasoning process. Within that model, we ask Broome’s central question of whether reasoning can make us more rational: whether it allows us to acquire transitive preferences, consistent beliefs, non-akratic intentions, and so on. We identify three structural types of rationality requirements: consistency requirements, completeness requirements, and closedness requirements. Many standard rationality requirements fall under this typology. Based on three theorems, we argue that reasoning is successful in achieving closedness requirements, but not in achieving consistency or completeness requirements. We assess how far our negative results reveal gaps in Broome's theory, or deficiencies in choice theory and behavioral economics.
How can different individuals' probability assignments to some events be aggregated into a collective probability assignment? Classic results on this problem assume that the set of relevant events -- the agenda -- is a sigma-algebra and is thus closed under disjunction (union) and conjunction (intersection). We drop this demanding assumption and explore probabilistic opinion pooling on general agendas. One might be interested in the probability of rain and that of an interest-rate increase, but not in the probability of rain or an interest-rate increase. We characterize linear pooling and neutral pooling for general agendas, with classic results as special cases for agendas that are sigma-algebras. As an illustrative application, we also consider probabilistic preference aggregation. Finally, we compare our results with existing results on binary judgment aggregation and Arrovian preference aggregation. This paper is the first of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
Democratic decision-making is often defended on grounds of the ‘wisdom of crowds’: decisions are more likely to be correct if they are based on many independent opinions, or so a typical argument in social epistemology goes. But what does it mean to have independent opinions? Opinions can be probabilistically dependent even if individuals form their opinion in causal isolation from each other. We distinguish four probabilistic notions of opinion independence. Which of them holds depends on how individuals are causally affected by environmental factors such as commonly perceived evidence. In a general theorem, we identify causal conditions guaranteeing each kind of opinion independence. These results have implications for whether and how ‘wisdom of crowds’ arguments are possible, and how truth-conducive institutions can be designed.
Under the independence and competence assumptions of Condorcet’s classical jury model, the probability of a correct majority decision converges to certainty as the jury size increases, a seemingly unrealistic result. Using Bayesian networks, we argue that the model’s independence assumption requires that the state of the world (guilty or not guilty) is the latest common cause of all jurors’ votes. But often – arguably in all courtroom cases and in many expert panels – the latest such common cause is a shared ‘body of evidence’ observed by the jurors. In the corresponding Bayesian network, the votes are direct descendants not of the state of the world, but of the body of evidence, which in turn is a direct descendant of the state of the world. We develop a model of jury decisions based on this Bayesian network. Our model permits the possibility of misleading evidence, even for a maximally competent observer, which cannot easily be accommodated in the classical model. We prove that (i) the probability of a correct majority verdict converges to the probability that the body of evidence is not misleading, a value typically below 1; (ii) depending on the required threshold of ‘no reasonable doubt’, it may be impossible, even in an arbitrarily large jury, to establish guilt of a defendant ‘beyond any reasonable doubt’.
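A small simulation sketch makes the difference from the classical model vivid. All parameter values and names below are illustrative assumptions, not the paper's: the state causes a shared body of evidence, which is non-misleading with probability 0.9, and each juror reads the evidence correctly with probability 0.8.

```python
# Toy simulation of the shared-evidence jury structure:
# state -> body of evidence (possibly misleading) -> votes.
import random

def trial(n_jurors, p_not_misleading=0.9, competence=0.8):
    state = random.random() < 0.5  # guilty or not guilty
    evidence_points_to_truth = random.random() < p_not_misleading
    signal = state if evidence_points_to_truth else not state
    votes = [signal if random.random() < competence else not signal
             for _ in range(n_jurors)]
    majority_verdict = sum(votes) > n_jurors / 2
    return majority_verdict == state

def estimated_accuracy(n_jurors, runs=20000):
    return sum(trial(n_jurors) for _ in range(runs)) / runs

for n in (1, 11, 101):
    print(n, round(estimated_accuracy(n), 3))
# Accuracy converges to about 0.9 -- the probability that the evidence
# is not misleading -- rather than to 1 as in the classical model.
```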
We present a general framework for representing belief-revision rules and use it to characterize Bayes's rule as a classical example and Jeffrey's rule as a non-classical one. In Jeffrey's rule, the input to a belief revision is not simply the information that some event has occurred, as in Bayes's rule, but a new assignment of probabilities to some events. Despite their differences, Bayes's and Jeffrey's rules can be characterized in terms of the same axioms: "responsiveness", which requires that revised beliefs incorporate what has been learnt, and "conservativeness", which requires that beliefs on which the learnt input is "silent" do not change. To illustrate the use of non-Bayesian belief revision in economic theory, we sketch a simple decision-theoretic application.
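In textbook form (not the paper's general framework), the two rules compare as follows, where E is the learnt event in Bayes's rule, and (E_k) is a partition of events receiving new probabilities q_k in Jeffrey's rule:

```latex
% Bayes's rule: conditionalise on the learnt event E
P_{\mathrm{new}}(A) = P(A \mid E)

% Jeffrey's rule: the input is a new probability assignment (q_k)
% on a partition (E_1, E_2, \dots)
P_{\mathrm{new}}(A) = \sum_{k} P(A \mid E_k)\, q_k
```

Bayes's rule is the special case of Jeffrey's rule in which one cell of the partition receives probability 1.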
Several recent results on the aggregation of judgments over logically connected propositions show that, under certain conditions, dictatorships are the only propositionwise aggregation functions generating fully rational (i.e., complete and consistent) collective judgments. A frequently mentioned route to avoid dictatorships is to allow incomplete collective judgments. We show that this route does not lead very far: we obtain oligarchies rather than dictatorships if instead of full rationality we merely require that collective judgments be deductively closed, arguably a minimal condition of rationality, compatible even with empty judgment sets. We derive several characterizations of oligarchies and provide illustrative applications to Arrowian preference aggregation and Kasher and Rubinstein's group identification problem.
A group is often construed as one agent with its own probabilistic beliefs (credences), which are obtained by aggregating those of the individuals, for instance through averaging. In their celebrated “Groupthink”, Russell et al. (2015) require group credences to undergo Bayesian revision whenever new information is learnt, i.e., whenever individual credences undergo Bayesian revision based on this information. To obtain a fully Bayesian group, one should often extend this requirement to non-public or even private information (learnt by not all or just one individual), or to non-representable information (not representable by any event in the domain where credences are held). I propose a taxonomy of six types of ‘group Bayesianism’. They differ in the information for which Bayesian revision of group credences is required: public representable information, private representable information, public non-representable information, etc. Six corresponding theorems establish how individual credences must (not) be aggregated to ensure group Bayesianism of any type, respectively. Aggregating through standard averaging is never permitted; instead, different forms of geometric averaging must be used. One theorem—that for public representable information—is essentially Russell et al.’s central result (with minor corrections). Another theorem—that for public non-representable information—fills a gap in the theory of externally Bayesian opinion pooling.
I propose a relevance-based independence axiom on how to aggregate individual yes/no judgments on given propositions into collective judgments: the collective judgment on a proposition depends only on people’s judgments on propositions which are relevant to that proposition. This axiom contrasts with the classical independence axiom: the collective judgment on a proposition depends only on people’s judgments on the same proposition. I generalize the premise-based rule and the sequential-priority rule to an arbitrary priority order of the propositions, instead of, respectively, a dichotomous premise/conclusion order or a linear priority order. I prove four impossibility theorems on relevance-based aggregation. One theorem simultaneously generalizes Arrow’s Theorem (in its general and indifference-free versions) and the well-known Arrow-like theorem in judgment aggregation.
In the emerging literature on judgment aggregation over logically connected propositions, expert rights or liberal rights have not been investigated yet. A group making collective judgments may assign individual members or subgroups with expert knowledge on, or particularly affected by, certain propositions the right to determine the collective judgment on those propositions. We identify a problem that generalizes Sen's 'liberal paradox'. Under plausible conditions, the assignment of rights to two or more individuals or subgroups is inconsistent with the unanimity principle, whereby unanimously accepted propositions are collectively accepted. The inconsistency can be avoided if individual judgments or rights satisfy special conditions.
The widely discussed "discursive dilemma" shows that majority voting in a group of individuals on logically connected propositions may produce irrational collective judgments. We generalize majority voting by considering quota rules, which accept each proposition if and only if the number of individuals accepting it exceeds a given threshold, where different thresholds may be used for different propositions. After characterizing quota rules, we prove necessary and sufficient conditions on the required thresholds for various collective rationality requirements. We also consider sequential quota rules, which ensure collective rationality by adjudicating propositions sequentially and letting earlier judgments constrain later ones. Sequential rules may be path-dependent and strategically manipulable. We characterize path-independence and prove its essential equivalence to strategy-proofness. Our results shed light on the rationality of simple-, super-, and sub-majoritarian decision-making.
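A quota rule is straightforward to implement. A hedged sketch under assumed names (the propositions, thresholds, and the discursive-dilemma-style example are illustrative):

```python
# Quota rule: accept proposition p iff the number of individuals
# accepting p reaches p's threshold m_p.

def quota_rule(judgment_sets, thresholds):
    """judgment_sets: one set of accepted propositions per individual.
    thresholds: dict mapping each proposition to its acceptance threshold."""
    return {p for p, m in thresholds.items()
            if sum(p in judgments for judgments in judgment_sets) >= m}

# Three individuals judging premises 'a', 'b' and the conclusion 'a&b',
# with majority thresholds on the premises and unanimity on the conclusion.
profile = [{'a', 'b', 'a&b'}, {'a'}, {'b'}]
thresholds = {'a': 2, 'b': 2, 'a&b': 3}
print(quota_rule(profile, thresholds))  # {'a', 'b'}: premises pass, conclusion fails
```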
How can the propositional attitudes of several individuals be aggregated into overall collective propositional attitudes? Although there are large bodies of work on the aggregation of various special kinds of propositional attitudes, such as preferences, judgments, probabilities and utilities, the aggregation of propositional attitudes is seldom studied in full generality. In this paper, we seek to contribute to filling this gap in the literature. We sketch the ingredients of a general theory of propositional attitude aggregation and prove two new theorems. Our first theorem simultaneously characterizes some prominent aggregation rules in the cases of probability, judgment and preference aggregation, including linear opinion pooling and Arrovian dictatorships. Our second theorem abstracts even further from the specific kinds of attitudes in question and describes the properties of a large class of aggregation rules applicable to a variety of belief-like attitudes. Our approach integrates some previously disconnected areas of investigation.
Rational choice theory analyzes how an agent can rationally act, given his or her preferences, but says little about where those preferences come from. Preferences are usually assumed to be fixed and exogenously given. Building on related work on reasons and rational choice, we describe a framework for conceptualizing preference formation and preference change. In our model, an agent's preferences are based on certain "motivationally salient" properties of the alternatives over which the preferences are held. Preferences may change as new properties of the alternatives become salient or previously salient properties cease to be salient. Our approach captures endogenous preferences in various contexts and helps to illuminate the distinction between formal and substantive concepts of rationality, as well as the role of perception in rational choice.
In the framework of judgment aggregation, we assume that some formulas of the agenda are singled out as premisses, and that both Independence (formula-wise aggregation) and Unanimity Preservation hold for them. Whether premiss-based aggregation thus defined is compatible with conclusion-based aggregation, as defined by Unanimity Preservation on the non-premisses, depends on how the premisses are logically connected, both among themselves and with other formulas. We state necessary and sufficient conditions under which the combination of both approaches leads to dictatorship (resp. oligarchy), either just on the premisses or on the whole agenda. This framework is inspired by the doctrinal paradox of legal theory and arguably relevant to this field as well as political science and political economy. When the set of premisses coincides with the whole agenda, a limiting case of our assumptions, we obtain several existing results in judgment aggregation theory.
We present an abstract model of rationality that focuses on structural properties of attitudes. Rationality requires coherence between your attitudes, such as your beliefs, values, and intentions. We define three 'logical' conditions on attitudes: consistency, completeness, and closedness. They parallel the familiar logical conditions on beliefs, but contrast with standard rationality conditions like preference transitivity. We establish a formal correspondence between our logical conditions and standard rationality conditions. Addressing John Broome's programme 'rationality through reasoning', we formally characterize how you can (not) become more logical by reasoning. Our analysis connects rationality with logic, and enables logical talk about multi-attitude psychology.
In judgment aggregation, unlike preference aggregation, not much is known about domain restrictions that guarantee consistent majority outcomes. We introduce several conditions on individual judgments sufficient for consistent majority judgments. Some are based on global orders of propositions or individuals, others on local orders, still others not on orders at all. Some generalize classic social-choice-theoretic domain conditions, others have no counterpart. Our most general condition generalizes Sen’s triplewise value-restriction, itself the most general classic condition. We also prove a new characterization theorem: for a large class of domains, if there exists any aggregation function satisfying some democratic conditions, then majority voting is the unique such function. Taken together, our results provide new support for the robustness of majority rule.
In this paper, we identify a new and mathematically well-defined sense in which the coherence of a set of hypotheses can be truth-conducive. Our focus is not, as usual, on the probability but on the confirmation of a coherent set and its members. We show that, if evidence confirms a hypothesis, confirmation is “transmitted” to any hypotheses that are sufficiently coherent with the former hypothesis, according to some appropriate probabilistic coherence measure such as Olsson’s or Fitelson’s measure. Our findings have implications for scientific methodology, as they provide a formal rationale for the method of indirect confirmation and the method of confirming theories by confirming their parts.
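As a concrete instance of the coherence measures mentioned, Olsson's measure for two hypotheses is standardly given as below (Fitelson's confirmation-based measure is more involved and omitted here); this is a textbook statement, not the paper's own definition:

```latex
% Olsson's coherence measure for hypotheses A and B
\mathrm{coh}_{O}(A, B) = \frac{P(A \wedge B)}{P(A \vee B)}
```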
If a group is modelled as a single Bayesian agent, what should its beliefs be? I propose an axiomatic model that connects group beliefs to beliefs of group members, who are themselves modelled as Bayesian agents, possibly with different priors and different information. Group beliefs are proven to take a simple multiplicative form if people’s information is independent, and a more complex form if information overlaps arbitrarily. This shows that group beliefs can incorporate all information spread over the individuals without the individuals having to communicate their (possibly complex and hard-to-describe) private information; communicating prior and posterior beliefs suffices. JEL classification: D70, D71.
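A hedged reconstruction of the 'simple multiplicative form' in the independent-information case (the paper's exact axioms, and the general form for overlapping information, are not reproduced here): with a common prior π and individual posteriors P_i based on independent private information, the group's posterior over worlds ω is proportional to the prior times the individual belief revisions,

```latex
P_{\mathrm{group}}(\omega) \;\propto\; \pi(\omega) \prod_{i=1}^{n} \frac{P_i(\omega)}{\pi(\omega)}
```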
Decision-making typically requires judgments about causal relations: we need to know the causal effects of our actions and the causal relevance of various environmental factors. We investigate how several individuals' causal judgments can be aggregated into collective causal judgments. First, we consider the aggregation of causal judgments via the aggregation of probabilistic judgments, and identify the limitations of this approach. We then explore the possibility of aggregating causal judgments independently of probabilistic ones. Formally, we introduce the problem of causal-network aggregation. Finally, we revisit the aggregation of probabilistic judgments when this is constrained by prior aggregation of qualitative causal judgments.
Jury theorems are mathematical theorems about the ability of collectives to make correct decisions. Several jury theorems carry the optimistic message that, in suitable circumstances, ‘crowds are wise’: many individuals together (using, for instance, majority voting) tend to make good decisions, outperforming fewer or just one individual. Jury theorems form the technical core of epistemic arguments for democracy, and provide probabilistic tools for reasoning about the epistemic quality of collective decisions. The popularity of jury theorems spans across various disciplines such as economics, political science, philosophy, and computer science. This entry reviews and critically assesses a variety of jury theorems. It first discusses Condorcet's initial jury theorem, and then progressively introduces jury theorems with more appropriate premises and conclusions. It explains the philosophical foundations, and relates jury theorems to diversity, deliberation, shared evidence, shared perspectives, and other phenomena. It finally connects jury theorems to their historical background and to democratic theory, social epistemology, and social choice theory.
What is the relationship between degrees of belief and binary beliefs? Can the latter be expressed as a function of the former—a so-called “belief-binarization rule”—without running into difficulties such as the lottery paradox? We show that this problem can be usefully analyzed from the perspective of judgment-aggregation theory. Although some formal similarities between belief binarization and judgment aggregation have been noted before, the connection between the two problems has not yet been studied in full generality. In this paper, we seek to fill this gap. The paper is organized around a baseline impossibility theorem, which we use to map out the space of possible solutions to the belief-binarization problem. Our theorem shows that, except in limiting cases, there exists no belief-binarization rule satisfying four initially plausible desiderata. Surprisingly, this result is a direct corollary of the judgment-aggregation variant of Arrow’s classic impossibility theorem in social choice theory.
In the theory of judgment aggregation, it is known for which agendas of propositions it is possible to aggregate individual judgments into collective ones in accordance with the Arrow-inspired requirements of universal domain, collective rationality, unanimity preservation, non-dictatorship and propositionwise independence. But it is only partially known (e.g., only in the monotonic case) for which agendas it is possible to respect additional requirements, notably non-oligarchy, anonymity, no individual veto power, or implication preservation. We fully characterize the agendas for which there are such possibilities, thereby answering the most salient open questions about propositionwise judgment aggregation. Our results build on earlier results by Nehring and Puppe (2002), Nehring (2006), Dietrich and List (2007a) and Dokow and Holzman (2010a).
Condorcet's famous jury theorem reaches an optimistic conclusion on the correctness of majority decisions, based on two controversial premises about voters: they are competent and vote independently, in a technical sense. I carefully analyse these premises and show that whether a premise is justified depends on the notion of probability considered, and that none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
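The optimistic conclusion under the two premises is a standard exact calculation, sketched below (a textbook computation, not the paper's analysis of the premises):

```python
# Probability that a majority of n (odd) independent voters is correct,
# each voter being correct with probability p > 1/2.
from math import comb

def majority_correct_prob(n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 11, 101, 1001):
    print(n, round(majority_correct_prob(n, 0.6), 4))
# Output approaches 1 as n grows -- the conclusion whose premises
# the paper scrutinizes under different notions of probability.
```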
The new field of judgment aggregation aims to find collective judgments on logically interconnected propositions. Recent impossibility results establish limitations on the possibility to vote independently on the propositions. I show that, fortunately, the impossibility results do not apply to a wide class of realistic agendas once propositions like “if a then b” are adequately modelled, namely as subjunctive implications rather than material implications. For these agendas, consistent and complete collective judgments can be reached through appropriate quota rules (which decide propositions using acceptance thresholds). I characterise the class of these quota rules. I also prove an abstract result that characterises consistent aggregation for arbitrary agendas in a general logic.
People reason not only in beliefs, but also in intentions, preferences, and other attitudes. They form preferences from existing preferences, or intentions from existing beliefs and intentions, and so on. This often involves choosing between rival conclusions. Building on Broome (Rationality through reasoning, Hoboken, Wiley. https://doi.org/10.1002/9781118609088, 2013) and Dietrich et al. (J Philos 116:585–614. https://doi.org/10.5840/jphil20191161138, 2019), we present a philosophical and formal analysis of reasoning in attitudes, with or without facing choices in reasoning. We give different accounts of choosing, in terms of a conscious activity or a partly subconscious process. Reasoning in attitudes differs fundamentally from reasoning _about_ attitudes, a form of theoretical reasoning in which one discovers rather than forms attitudes. We show that reasoning in attitudes has standard formal properties (such as monotonicity), but is indeterministic, reflecting choice in reasoning. Like theoretical reasoning, it need not follow logical entailment, but for a more radical reason, namely indeterminism. This makes reasoning in attitudes harder to model logically than theoretical reasoning. But it can be studied abstractly, using indeterministic consequence operators.
Standard impossibility theorems on judgment aggregation over logically connected propositions either use a controversial systematicity condition or apply only to agendas of propositions with rich logical connections. Are there any serious impossibilities without these restrictions? We prove an impossibility theorem without requiring systematicity that applies to most standard agendas: Every judgment aggregation function (with rational inputs and outputs) satisfying a condition called unbiasedness is dictatorial (or effectively dictatorial if we remove one of the agenda conditions). Our agenda conditions are tight. When applied illustratively to (strict) preference aggregation represented in our model, the result implies that every unbiased social welfare function with universal domain is effectively dictatorial.
All existing impossibility theorems on judgment aggregation require individual and collective judgment sets to be consistent and complete, arguably a demanding rationality requirement. They do not carry over to aggregation functions mapping profiles of consistent individual judgment sets to consistent collective ones. We prove that, whenever the agenda of propositions under consideration exhibits mild interconnections, any such aggregation function that is "neutral" between the acceptance and rejection of each proposition is dictatorial. We relate this theorem to the literature.
According to standard rational choice theory, as commonly used in political science and economics, an agent's fundamental preferences are exogenously fixed, and any preference change over decision options is due to Bayesian information learning. Although elegant and parsimonious, such a model fails to account for preference change driven by experiences or psychological changes distinct from information learning. We develop a model of non-informational preference change. Alternatives are modelled as points in some multidimensional space, only some of whose dimensions play a role in shaping the agent's preferences. Any change in these "motivationally salient" dimensions can change the agent's preferences. How it does so is described by a new representation theorem. Our model not only captures a wide range of frequently observed phenomena, but also generalizes some standard representations of preferences in political science and economics.
Economic models describe individuals in terms of underlying characteristics, such as taste for some good, sympathy level for another player, time discount rate, risk attitude, and so on. In real life, such characteristics change through experiences: taste for Mozart changes through listening to it, sympathy for another player through observing his moves, and so on. Models typically ignore change, not just for simplicity but also because it is unclear how to incorporate change. I introduce a general axiomatic framework for defining, analysing and comparing rival models of change. I show that seemingly basic postulates on modelling change together have strong implications, like irrelevance of the order in which someone has his experiences and ‘linearity’ of change. This is a step towards placing the modelling of change on solid axiomatic grounds and enabling non-arbitrary incorporation of change into economic models.
How can different individuals' probability functions on a given sigma-algebra of events be aggregated into a collective probability function? Classic approaches to this problem often require 'event-wise independence': the collective probability for each event should depend only on the individuals' probabilities for that event. In practice, however, some events may be 'basic' and others 'derivative', so that it makes sense first to aggregate the probabilities for the former and then to let these constrain the probabilities for the latter. We formalize this idea by introducing a 'premise-based' approach to probabilistic opinion pooling, and show that, under a variety of assumptions, it leads to linear or neutral opinion pooling on the 'premises'. This paper is the second of two self-contained, but technically related companion papers inspired by binary judgment-aggregation theory.
We give a review and critique of jury theorems from a social-epistemology perspective, covering Condorcet’s (1785) classic theorem and several later refinements and departures. We assess the plausibility of the conclusions and premises featuring in jury theorems and evaluate the potential of such theorems to serve as formal arguments for the ‘wisdom of crowds’. In particular, we argue (i) that there is a fundamental tension between voters’ independence and voters’ competence, hence between the two premises of most jury theorems; (ii) that the (asymptotic) conclusion that ‘huge groups are infallible’, reached by many jury theorems, is an artifact of unjustified premises; and (iii) that the (nonasymptotic) conclusion that ‘larger groups are more reliable’, also reached by many jury theorems, is not an artifact and should be regarded as the more adequate formal rendition of the ‘wisdom of crowds’.
Assuming that votes are independent, the epistemically optimal procedure in a binary collective choice problem is known to be a weighted supermajority rule with weights given by personal log-likelihood-ratios. It is shown here that an analogous result holds in a much more general model. Firstly, the result follows from a more basic principle than expected-utility maximisation, namely from an axiom (Epistemic Monotonicity) which requires neither utilities nor prior probabilities of the ‘correctness’ of alternatives. Secondly, a person’s input need not be a vote for an alternative, it may be any type of input, for instance a subjective degree of belief or probability of the correctness of one of the alternatives. The case of a profile of subjective degrees of belief is particularly appealing, since here no parameters such as competence parameters need to be known.
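The benchmark result can be stated compactly (a standard formulation, with notation not taken from the paper): with independent votes v_i ∈ {+1, −1} and individual competences p_i, the epistemically optimal rule decides for the weighted majority with log-odds weights:

```latex
w_i = \log \frac{p_i}{1 - p_i},
\qquad
\text{decide for } +1 \;\iff\; \sum_{i=1}^{n} w_i \, v_i > 0
```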
In solving judgment aggregation problems, groups often face constraints. Many decision problems can be modelled in terms of the acceptance or rejection of certain propositions in a language, and constraints as propositions that the decisions should be consistent with. For example, court judgments in breach-of-contract cases should be consistent with the constraint that action and obligation are necessary and sufficient for liability; judgments on how to rank several options in an order of preference with the constraint of transitivity; and judgments on budget items with budgetary constraints. Often more or less demanding constraints on decisions are imaginable. For instance, in preference ranking problems, the transitivity constraint is often contrasted with the weaker acyclicity constraint. In this paper, we make constraints explicit in judgment aggregation by relativizing the rationality conditions of consistency and deductive closure to a constraint set, whose variation yields more or less strong notions of rationality. We review several general results on judgment aggregation in light of such constraints.
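The relativization can be stated compactly (paraphrasing the abstract's definitions, with notation chosen here): for a judgment set J on an agenda of propositions and a constraint set C,

```latex
% Consistency relative to the constraint set C
J \text{ is consistent relative to } C \iff J \cup C \text{ is consistent}

% Deductive closure relative to C (for agenda propositions p)
J \text{ is closed relative to } C \iff
\forall p:\; \big( J \cup C \vDash p \;\Rightarrow\; p \in J \big)
```

Setting C to the empty set recovers the usual, unrelativized notions.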
Does pre-voting group deliberation increase majority competence? To address this question, we develop a probabilistic model of opinion formation and deliberation. Two new jury theorems, one pre-deliberation and one post-deliberation, suggest that deliberation is beneficial. Successful deliberation mitigates three voting failures: (1) overcounting widespread evidence, (2) neglecting evidential inequality, and (3) neglecting evidential complementarity. Simulations and theoretical arguments confirm this. But there are five systematic exceptions where deliberation reduces majority competence, always by increasing failure (1). Our analysis recommends deliberation that is 'participatory', 'even', but possibly 'unequal', i.e., that involves substantive sharing, privileges no evidence, but possibly privileges some persons.
This essay discusses the difficulty of reconciling two paradigms about beliefs: the binary or categorical paradigm of yes/no beliefs and the probabilistic paradigm of degrees of belief. The possibility of holding both types of belief simultaneously is challenged by the lottery paradox, and more recently by a general impossibility theorem by Dietrich and List (2018, 2021). The nature, relevance, and implications of the tension are explained and assessed.
There has been much discussion on the two-envelope paradox. Clark and Shackel (2000) have proposed a solution to the paradox, which has been refuted by Meacham and Weisberg (2003). Surprisingly, however, the literature still contains no axiomatic justification for the claim that one should be indifferent between the two envelopes before opening one of them. According to Meacham and Weisberg, "decision theory does not rank swapping against sticking [before opening any envelope]" (p. 686). To fill this gap in the literature, we present a simple axiomatic justification for indifference, avoiding any expectation reasoning, which is often considered problematic in infinite cases. Although the two-envelope paradox assumes an expectation-maximizing agent, we show that analogous paradoxes arise for agents using different decision principles such as maximin and maximax, and that our justification for indifference before opening applies here too.
Bayesian epistemology tells us with great precision how we should move from prior to posterior beliefs in light of new evidence or information, but says little about where our prior beliefs come from. It offers few resources to describe some prior beliefs as rational or well-justified, and others as irrational or unreasonable. A different strand of epistemology takes the central epistemological question to be not how to change one’s beliefs in light of new evidence, but what reasons justify a given set of beliefs in the first place. We offer an account of rational belief formation that closes some of the gap between Bayesianism and its reason-based alternative, formalizing the idea that an agent can have reasons for his or her (prior) beliefs, in addition to evidence or information in the ordinary Bayesian sense. Our analysis of reasons for belief is part of a larger programme of research on the role of reasons in rational agency (Dietrich and List, Nous, 2012a, in press; Int J Game Theory, 2012b, in press).
Tough anti-terrorism policies are often defended by focusing on a fixed minority of the population who prefer violent outcomes, and arguing that toughness reduces the risk of terrorism from this group. This reasoning implicitly assumes that tough policies do not increase the group of 'potential terrorists', i.e., of people with violent preferences. Preferences and their level of violence are treated as stable, exogenously fixed features. To avoid this unrealistic assumption, I formulate a model in which policies can 'brutalise' or 'appease' someone's personality, i.e., his preferences. This follows the endogenous preferences approach, popular elsewhere in political science and economics. I formally decompose the effect of toughness into a (desirable) deterrence effect and an (undesirable) provocation effect. Whether toughness is overall efficient depends on which effect outweighs the other. I show that neglecting provocation typically leads to exaggerated toughness. This suggests that some tough anti-terrorism policies observable in the present and past can be explained by a neglect of provocation.
Judgment-aggregation theory has always focused on the attainment of rational collective judgments. But so far, rationality has been understood in static terms: as coherence of judgments at a given time, defined as consistency, completeness, and/or deductive closure. This paper asks whether collective judgments can be dynamically rational, so that they change rationally in response to new information. Formally, a judgment aggregation rule is dynamically rational with respect to a given revision operator if, whenever all individuals revise their judgments in light of some information (a learnt proposition), then the new aggregate judgments are the old ones revised in light of this information, i.e., aggregation and revision commute. We prove an impossibility theorem: if the propositions on the agenda are non-trivially connected, no judgment aggregation rule with standard properties is dynamically rational with respect to any revision operator satisfying some basic conditions on revision. Our theorem is the dynamic-rationality counterpart of some well-known impossibility theorems for static rationality. We also explore how dynamic rationality might be achieved by relaxing some of the conditions on the aggregation rule and/or the revision operator. Notably, premise-based aggregation rules are dynamically rational with respect to so-called premise-based revision operators.
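In symbols, the dynamic-rationality requirement stated in the abstract reads as follows, where F is the aggregation rule, ∗ the revision operator, p the learnt proposition, and J_1, …, J_n the individual judgment sets:

```latex
F(J_1 \ast p,\, \dots,\, J_n \ast p) \;=\; F(J_1, \dots, J_n) \ast p
```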