Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that the mathematical structures underlying quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modeling probabilistic-dynamic systems using two aspects of quantum theory. The first, 'contextuality', is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, 'quantum entanglement', allows cognitive phenomena to be modeled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in an entirely new light. Introducing the basic principles in an easy-to-follow way, this book does not assume a physics background or a quantum brain, and comes complete with a tutorial and fully worked-out applications in important areas of cognition and decision.
Classical (Bayesian) probability (CP) theory has led to an influential research tradition for modeling cognitive processes. Cognitive scientists have been trained to work with CP principles for so long that it is hard even to imagine alternative ways to formalize probabilities. However, in physics, quantum probability (QP) theory has been the dominant probabilistic approach for nearly 100 years. Could QP theory provide us with any advantages in cognitive modeling as well? Note first that both CP and QP theory share the fundamental assumption that it is possible to model cognition on the basis of formal, probabilistic principles. But why consider a QP approach? The answers are that (1) there are many well-established empirical findings (e.g., from the influential Tversky, Kahneman research tradition) that are hard to reconcile with CP principles; and (2) these same findings have natural and straightforward explanations with quantum principles. In QP theory, probabilistic assessment is often strongly context- and order-dependent, individual states can be superposition states (that are impossible to associate with specific values), and composite systems can be entangled (they cannot be decomposed into their subsystems). All these characteristics appear perplexing from a classical perspective. However, our thesis is that they provide a more accurate and powerful account of certain cognitive processes. We first introduce QP theory and illustrate its application with psychological examples. We then review empirical findings that motivate the use of quantum theory in cognitive theory, but also discuss ways in which QP and CP theories converge. Finally, we consider the implications of a QP theory approach to cognition for human rationality.
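Incompatibility is the formal source of this context and order dependence. As a minimal sketch (our illustration; the questions, angles, and state below are made up for the example, not taken from any cited study), two questions can be represented as non-commuting projectors on a two-dimensional state space, and the Born-rule probability of answering "yes" to both then depends on which question is asked first:

```python
import numpy as np

def ray_projector(theta):
    """Projector onto the ray spanned by (cos(theta), sin(theta))."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # initial belief state (unit vector)
P_A = ray_projector(np.pi / 6)      # "yes" answer to question A
P_B = ray_projector(np.pi / 3)      # "yes" answer to question B

# Sequential "yes, then yes" probabilities via the Born rule; the
# projectors do not commute, so the two orders disagree.
p_AB = np.linalg.norm(P_B @ P_A @ psi) ** 2   # A asked first: 0.5625
p_BA = np.linalg.norm(P_A @ P_B @ psi) ** 2   # B asked first: 0.1875
print(p_AB, p_BA)
```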
Quantum cognition research applies abstract, mathematical principles of quantum theory to inquiries in cognitive science. It differs fundamentally from alternative speculations about quantum brain processes. This topic presents new developments within this research program. In the introduction to this topic, we try to answer three questions: Why apply quantum concepts to human cognition? How is quantum cognitive modeling different from traditional cognitive modeling? What cognitive processes have been modeled using a quantum account? In addition, a brief introduction to quantum probability theory and a concrete example are provided to illustrate how a quantum cognitive model can be developed to explain paradoxical empirical findings in the psychological literature.
Question order effects are commonly observed in self-report measures of judgment and attitude. This article develops a quantum question order model (the QQ model) to account for four types of question order effects observed in the literature. First, the postulates of the QQ model are presented. Second, an a priori, parameter-free, and precise prediction, called the QQ equality, is derived from these mathematical principles, and six empirical data sets are used to test the prediction. Third, a new index is derived from the model to measure similarity between questions. Fourth, we show that, in contrast to the QQ model, Bayesian and Markov models do not generally satisfy the QQ equality and thus cannot account for the reported empirical data that support this equality. Finally, we describe the conditions under which order effects are predicted to occur, and we review a broader range of findings that are encompassed by these same quantum principles. We conclude that quantum probability theory, initially invented to explain order effects on measurements in physics, appears to be a powerful natural explanation for order effects of self-report measures in the social and behavioral sciences as well.
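Because the QQ equality must hold for any initial state and any pair of projectors, it can be checked numerically without fitting anything. A small sketch of such a check (our construction, not the article's code):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6                                    # dimensionality is arbitrary

def random_projector(dim, rank):
    """Projector onto the span of `rank` random orthonormal vectors."""
    q, _ = np.linalg.qr(rng.standard_normal((dim, rank)))
    return q @ q.T

def seq_prob(first, second, psi):
    """Born-rule probability of the answer `first`, then `second`."""
    return np.linalg.norm(second @ first @ psi) ** 2

psi = rng.standard_normal(n)
psi /= np.linalg.norm(psi)               # random initial belief state
P_A, P_B = random_projector(n, 2), random_projector(n, 3)
Q_A, Q_B = np.eye(n) - P_A, np.eye(n) - P_B   # the "no" projectors

lhs = seq_prob(P_A, P_B, psi) + seq_prob(Q_A, Q_B, psi)  # order A, B
rhs = seq_prob(P_B, P_A, psi) + seq_prob(Q_B, Q_A, psi)  # order B, A
print(np.isclose(lhs, rhs))              # True: the QQ equality
```

The equality follows algebraically because P_A P_B P_A + (I - P_A)(I - P_B)(I - P_A) expands to an expression symmetric in A and B; no model parameter enters, which is why the test is a priori.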
Order of information plays a crucial role in the process of updating beliefs across time. In fact, the presence of order effects makes a classical or Bayesian approach to inference difficult. As a result, existing models of inference, such as the belief-adjustment model, merely provide an ad hoc explanation for these effects. We postulate a quantum inference model for order effects based on the axiomatic principles of quantum probability theory. The quantum inference model explains order effects by transforming a state vector with different sequences of operators for different orderings of information. We demonstrate this process by fitting the quantum model to data collected in a medical diagnostic task and a jury decision-making task. To further test the quantum inference model, a new jury decision-making experiment is developed. Using the results of this experiment, we compare the quantum inference model with two versions of the belief-adjustment model, the adding model and the averaging model. We show that both the quantum model and the adding model provide good fits to the data. To distinguish the quantum model from the adding model, we develop a new experiment involving extreme evidence. The results from this new experiment suggest that the adding model faces limitations when accounting for tasks involving extreme evidence, whereas the quantum inference model does not. Ultimately, we argue that the quantum model provides a more coherent account of order effects than was previously possible.
It is a hallmark of a good model to make accurate a priori predictions to new conditions (Busemeyer & Wang, 2000). This study compared 8 decision learning models with respect to their generalizability. Participants performed 2 tasks (the Iowa Gambling Task and the Soochow Gambling Task), and each model made a priori predictions by estimating the parameters for each participant from 1 task and using those same parameters to predict performance on the other task. Three methods were used to evaluate the models at the individual level of analysis. The first method used a post hoc fit criterion, the second used a generalization criterion for short-term predictions, and the third used a generalization criterion for long-term predictions. The results suggest that the models with the prospect utility function can make generalizable predictions to new conditions, and that different learning models are needed for making short- versus long-term predictions on simple gambling tasks.
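As a minimal sketch of the generalization procedure (the toy delta-rule learner, function names, and data layout here are hypothetical placeholders, not the study's models or code), each participant's parameters are estimated on one task and then scored, unchanged, on the other:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, choices, payoffs, n_options=4):
    """Negative log-likelihood of a choice sequence under a toy
    delta-rule learner (learning rate a, choice sensitivity c)."""
    a, c = params
    ev = np.zeros(n_options)                 # expectancy per option
    nll = 0.0
    for choice, payoff in zip(choices, payoffs):
        z = c * ev - np.max(c * ev)
        p = np.exp(z) / np.exp(z).sum()      # softmax choice rule
        nll -= np.log(p[choice])
        ev[choice] += a * (payoff - ev[choice])  # delta-rule update
    return nll

def generalization_score(task1, task2):
    """Fit on task 1; evaluate the same parameters on task 2."""
    fit = minimize(neg_log_lik, x0=[0.2, 1.0],
                   args=(task1["choices"], task1["payoffs"]),
                   method="Nelder-Mead")
    return neg_log_lik(fit.x, task2["choices"], task2["payoffs"])
```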
When constrained by limited resources, how do we choose axioms of rationality? The target article relies on Bayesian reasoning that encounters serious tractability problems. We propose another axiomatic foundation: quantum probability theory, which provides less complex and more comprehensive descriptions. More generally, defining rationality in terms of axiomatic systems misses a key issue: rationality must be defined by humans facing vague information.
The Iowa Gambling Task (IGT) and the Soochow Gambling Task (SGT) are two experience-based risky decision-making tasks for examining decision-making deficits in clinical populations. Several cognitive models, including the expectancy-valence learning model (EVL) and the prospect valence learning model (PVL), have been developed to disentangle the motivational, cognitive, and response processes underlying the explicit choices in these tasks. The purpose of the current study was to develop an improved model that can fit empirical data better than the EVL and PVL models and, in addition, produce more consistent parameter estimates across the IGT and SGT. Twenty-six opiate users (mean age 34.23 years, SD 8.79) and 27 control participants (mean age 35 years, SD 10.44) completed both tasks. Eighteen cognitive models varying in evaluation, updating, and choice rules were fit to individual data, and their performance was compared with that of a statistical baseline model to find the best-fitting model. The results showed that the model combining the prospect utility function (treating gains and losses separately), the decay-reinforcement updating rule, and the trial-independent choice rule performed best in both tasks. Furthermore, the winning model produced more consistent individual parameter estimates across the two tasks than any of the other models.
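The three components of the winning model can be sketched as follows (a sketch in common PVL notation, assuming standard parameterizations from this modeling literature; it is our illustration, not the authors' implementation):

```python
import numpy as np

def pvl_decay_trial_probs(choices, gains, losses,
                          alpha=0.5, lam=2.0, A=0.8, c=1.0, n_decks=4):
    """Per-trial probability assigned to each deck actually chosen."""
    theta = 3.0 ** c - 1.0                 # trial-independent sensitivity
    E = np.zeros(n_decks)                  # deck expectancies
    probs = []
    for j, gain, loss in zip(choices, gains, losses):
        z = theta * E - np.max(theta * E)
        p = np.exp(z) / np.exp(z).sum()    # softmax choice rule
        probs.append(p[j])
        # Prospect utility with gains and losses evaluated separately:
        u = gain ** alpha - lam * abs(loss) ** alpha
        # Decay-reinforcement: every expectancy decays, and the chosen
        # deck is additionally reinforced by the experienced utility.
        E = A * E
        E[j] += u
    return np.array(probs)
```

A model comparison of the kind described above would then score `-np.log(probs).sum()` against the statistical baseline.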
Quantum cognition is an emerging field that uses mathematical principles of quantum theory to help formalize and understand cognitive systems and processes. The topic on the potential of using quantum theory to build models of cognition (Volume 5, Issue 4) introduces and synthesizes new developments in this research program through an introduction and six core articles. The current issue presents 14 commentaries on the core articles. Five key issues surface, some of which are interestingly controversial and debatable, as expected for a newly emerging field.
The attempt to employ quantum principles for modeling cognition has enabled the introduction of several new concepts in psychology, such as the uncertainty principle, incompatibility, entanglement, and superposition. For many commentators, this is an exciting opportunity to question existing formal frameworks (notably classical probability theory) and explore what is to be gained by employing these novel conceptual tools. This is not to say that major empirical challenges are absent. For example, can we definitively prove the necessity for quantum, as opposed to classical, models? Can the distinction between compatibility and incompatibility inform our understanding of differences between human and nonhuman cognition? Are quantum models less constrained than classical ones? Does incompatibility arise as a limitation, to avoid the requirements of the principle of unicity, or is it an inherent (or essential) characteristic of intelligent thought? For everyday judgments, do quantum principles allow more accurate predictions than classical ones? Some of these questions can be confidently addressed within existing quantum models; a definitive resolution of others will have to await further work. What is clear is that the consideration of quantum cognitive models has enabled a new focus on a range of debates about fundamental aspects of cognition.
Many decisions involve multiple stages of choices and events, and these decisions can be represented graphically as decision trees. Optimal decision strategies for decision trees are commonly determined by a backward induction analysis that demands adherence to three fundamental consistency principles: dynamic, consequential, and strategic. Previous research found that decision-makers tend to exhibit violations of dynamic and strategic consistency at rates significantly higher than choice inconsistency across various levels of potential reward. The current research extends these findings under new conditions; specifically, it explores the extent to which these principles are violated as a function of the planning-horizon length of the decision tree. Results from two experiments suggest that dynamic inconsistency increases as tree length increases; these results are explained within a dynamic approach-avoidance framework.
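For reference, backward induction, the normative standard against which these consistency principles are defined, is easy to state in code. A toy sketch (our example, not the experiments' materials):

```python
# A decision tree is a payoff (leaf) or a ("D"|"C", branches) pair:
# decision nodes take the best branch, chance nodes take the expectation,
# and the optimal plan is computed from the leaves back to the root.

def backward_induction(node):
    """Expected value of a tree node under optimal future choices."""
    if isinstance(node, (int, float)):    # leaf: terminal payoff
        return node
    kind, branches = node
    if kind == "D":                       # decision node
        return max(backward_induction(child) for child in branches)
    if kind == "C":                       # chance node: (prob, child) pairs
        return sum(p * backward_induction(child) for p, child in branches)
    raise ValueError(f"unknown node kind: {kind!r}")

# Two-stage example: a gamble with a later choice, versus a sure amount.
tree = ("D", [("C", [(0.5, ("D", [100, 0])), (0.5, -20)]), 25])
print(backward_induction(tree))           # 0.5*100 + 0.5*(-20) = 40 > 25
```

Dynamic consistency then asks whether the decision-maker, on actually reaching an interior node, still takes the action this analysis planned for it.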
Though individual categorization and decision processes have been studied separately in many previous investigations, few studies have investigated how they interact in a two-stage task of first categorizing and then deciding. To address this issue, we investigated a categorization-decision task in two experiments. In both, participants were shown six faces varying in width, first asked to categorize each face, and then to decide on a course of action for it. Each experiment was designed to include three groups, and for each group, we manipulated the probabilistic contingencies between stimulus, category assignments, and decision consequences. For each group, each participant received three different sequences of category response, category feedback, decision response, and decision feedback. We found that participants were only partially responsive, in the appropriate directions, to the contingencies assigned to each group. Comparisons of results from different sequences provided evidence for empirical interference effects of categorization on decisions. The empirical interference effect is defined as the difference between the probability of taking a hostile action in decision-alone conditions and the total probability of taking a hostile action in categorization-decision conditions. To test competing accounts of multiple empirical results, including two-stage choice probabilities and empirical interference effects, we compared a quantum cognition model with a two-stage exemplar categorization model at both the aggregate and individual levels. Using the Bayesian information criterion, we found that the quantum model provided an overall better fit than the exemplar model. Although both models predicted empirical interference effects, the exemplar model generated the probabilistic deviation by incorporating category information from the first stage into the feature representation of the subsequent decision stage, whereas the quantum model produced the interference effect through superposition, measurement, and quantum entanglement.
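In the quantum account, the interference term has a standard algebraic form. In our own notation (a sketch, not the article's derivation): writing G and B for the two categories and A for the hostile action, the decision-alone probability superposes the category amplitudes, so

```latex
p(A) \;=\; \bigl|\psi_G\,\psi_{A\mid G} + \psi_B\,\psi_{A\mid B}\bigr|^{2}
      \;=\; \underbrace{p(G)\,p(A\mid G) + p(B)\,p(A\mid B)}_{\text{total probability}}
      \;+\; \underbrace{2\sqrt{p(G)\,p(A\mid G)\,p(B)\,p(A\mid B)}\,\cos\theta}_{\text{interference}}
```

where θ is the phase between the two paths; the empirical interference effect defined above is an estimate of the second term, and it vanishes (θ = π/2) exactly when the classical law of total probability holds.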
The study of decision making has traditionally been dominated by axiomatic utility theories. More recently, an alternative approach, which focuses on the micro-mechanisms of the underlying deliberation process, has been shown to account for several "paradoxes" of human choice behavior that simple utility-based approaches cannot explain. Decision field theory (DFT) is a cognitive-dynamical model of decision making and preferential choice, built on the fundamental principle that decisions are based on the accumulation of subjective evaluations of the choice alternatives until a threshold criterion is met. This article extends the basic DFT framework to the domain of dynamic decision making. Dynamic decision field theory (DFT-D) is proposed as a new alternative to normative backward induction. Through its attention to the processes underlying planning and deliberation, DFT-D provides simple, emergent explanations for violations of choice principles traditionally taken as evidence of irrationality. A recent multistage decision-making study is used to showcase the model's efficacy for developing cognitive models of individual strategies.
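The core accumulation principle is compact enough to simulate. A minimal sketch of sequential-sampling dynamics in the spirit of DFT (a simplification of ours, not the DFT-D model itself):

```python
import numpy as np

def dft_choice(M, w, s=0.95, threshold=2.0, max_steps=10_000, seed=None):
    """Simulate one choice among the rows (options) of M.

    M : (n_options, n_attributes) subjective evaluations
    w : attention probabilities over the attributes
    s : self-feedback (memory) of the preference state
    """
    rng = np.random.default_rng(seed)
    P = np.zeros(M.shape[0])               # preference state
    for _ in range(max_steps):
        attr = rng.choice(len(w), p=w)     # attention wanders stochastically
        v = M[:, attr] - M[:, attr].mean() # valence: contrast with the mean
        P = s * P + v                      # accumulate evaluations
        if P.max() >= threshold:           # threshold criterion met
            return int(P.argmax())
    return int(P.argmax())

# Example: two options trading off on two equally attended attributes.
M = np.array([[1.0, 0.2],
              [0.4, 0.9]])
print(dft_choice(M, w=[0.5, 0.5]))
```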
Understanding cognitive processes with a formal framework necessitates some limited, internal prescriptive normativism. This is because it is not possible to endorse the psychological relevance of some axioms in a formal framework while rejecting that of others. The empirical challenge then becomes identifying the remit of different formal frameworks, an objective consistent with the descriptivism Elqayam & Evans (E&E) advocate.
A puzzling finding from research on strategic decision making concerns the effect that predictions have on future actions. Simply stating a prediction about an opponent changes the total probability (pooled over predictions) of a player taking a future action, compared with not stating any prediction. These interference effects are difficult to explain with traditional economic models, and they instead suggest turning to a quantum cognition approach to strategic decision making.
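Concretely, the finding is a violation of the classical law of total probability. In our notation (a sketch, not the studies' own formalization), with D for the action (e.g., defection) and the two possible stated predictions about the opponent:

```latex
p(D \mid \text{no prediction}) \;\neq\;
p(\text{predict } C)\,p(D \mid \text{predict } C) \;+\;
p(\text{predict } D)\,p(D \mid \text{predict } D)
```

A classical mixture over the unstated predictions must satisfy this identity; quantum models accommodate the discrepancy through an interference term between the two prediction amplitudes.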