Dual-process and dual-system theories in both cognitive and social psychology have been subjected to a number of recently published criticisms. However, they have been attacked as a category, on the incorrect assumption that there is a generic version that applies to all. We identify and respond to five main lines of argument made by such critics. We agree that some of these arguments have force against some of the theories in the literature but believe them to be overstated. We argue that the dual-processing distinction is supported by much recent evidence in cognitive science. Our preferred theoretical approach is one in which rapid autonomous processes are assumed to yield default responses unless intervened on by distinctive higher-order reasoning processes. What defines the difference is that Type 2 processing supports hypothetical thinking and loads heavily on working memory.
This book represents the first major attempt by any author to provide an integrated account of the evidence for bias in human reasoning across a wide range of disparate psychological literatures. The topics discussed involve both deductive and inductive reasoning as well as statistical judgement and inference. In addition, the author proposes a general theoretical approach to the explanations of bias and considers the practical implications for real world decision making. The theoretical stance of the book is based on a distinction between preconscious heuristic processes which determine the mental representation of 'relevant' features of the problem content, and subsequent analytic reasoning processes which generate inferences and judgements. Phenomena discussed and interpreted within this framework include feature matching biases in propositional reasoning, confirmation bias, biasing and debiasing effects of knowledge on reasoning, and biases in statistical judgement normally attributed to 'availability' and 'representativeness' heuristics. In the final chapter, the practical consequences of bias for real life decision making are considered, together with various issues concerning the problem of 'debiasing'. The major approaches discussed are those involving education and training on the one hand, and the development of intelligent software and interactive decision aids on the other.
Originally published in 1982, this book offered an extensive and up-to-date review of research into the psychology of deductive reasoning. Jonathan Evans presents an alternative theoretical framework to the rationalist approach which had dominated much of the published work in this field at the time. The review falls into three sections. The first is concerned with elementary reasoning tasks, in which response latency is the prime measure of interest. The second and third sections are concerned with syllogistic and propositional reasoning respectively, in which interest has focused on the explanation of frequently observed logical errors. In an extended discussion it is argued that reasoning processes are content specific, and give little indication of the operation of any underlying system of logical competence. Finally, a dual process theory of reasoning, with broad implications and connections with other fields of psychology, is elaborated and assessed in the light of recent evidence.
We propose a critique of normativism, defined as the idea that human thinking reflects a normative system against which it should be measured and judged. We analyze the methodological problems associated with normativism, proposing that it invites the controversial “is-ought” inference, much contested in the philosophical literature. This problem is triggered when there are competing normative accounts (the arbitration problem), as empirical evidence can help arbitrate between descriptive theories, but not between normative systems. Drawing on linguistics as a model, we propose that a clear distinction between normative systems and competence theories is essential, arguing that equating them invites an “is-ought” inference: to wit, supporting normative “ought” theories with empirical “is” evidence. We analyze in detail two research programmes with normativist features – Oaksford and Chater’s rational analysis and Stanovich and West’s individual differences approach – demonstrating how, in each case, equating norm and competence leads to an is-ought inference. Normativism triggers a host of research biases in the psychology of reasoning and decision making: focusing on untrained participants and novel problems, analyzing psychological processes in terms of their normative correlates, and neglecting philosophically significant paradigms when they do not supply clear standards for normative judgement. For example, in a dual-process framework, normativism can lead to a fallacious “ought-is” inference, in which normative responses are taken as diagnostic of analytic reasoning. We propose that little can be gained from normativism that cannot be achieved by descriptivist computational-level analysis, illustrating our position with Hypothetical Thinking Theory and the theory of the suppositional conditional. We conclude that descriptivism is a viable option, and that theories of higher mental processing would be better off freed from normative considerations.
This book explores the idea that much of our behaviour is controlled by automatic and intuitive mental processes, which shape and compete with our conscious thinking and decision making. Accessibly written, and assuming no prior knowledge of the field, the book will be fascinating reading for all those interested in human behaviour.
This book explores the idea that we have two minds - one automatic, unconscious, and fast, the other controlled, conscious, and slow. It brings together leading researchers on dual-process theory to summarize the state of the art, highlight key issues, present different perspectives, and provide a stimulus to further work.
In common with a number of other authors I believe that there has been a paradigm shift in the psychology of reasoning, specifically the area traditionally labelled as the study of deduction. The deduction paradigm was founded in a philosophical tradition that assumed logicality as the basis for rational thought, and provided binary propositional logic as the agreed normative framework. By contrast, many contemporary authors assume that people have degrees of uncertainty in both premises and conclusions, and reject binary logic as a workable normative system. I discuss a number of questions and challenges for this new psychology of reasoning, including the following: (a) Do we need an alternative normative system, such as Bayesianism, for the new paradigm? (b) Is there any longer a clear distinction between the study of deductive and inductive reasoning, the latter having its own tradition and literature? (c) Precisely how is the integrated study of reasoning and decision making facilitated by the new paradigm? (d) What difficulties with dual-processing approaches need to be resolved, if they are to take us forward?
The study of deductive reasoning has been a major paradigm in psychology for approximately the past 40 years. Research has shown that people make many logical errors on such tasks and are strongly influenced by problem content and context. It is argued that this paradigm was developed in a context of logicist thinking that is now outmoded. Few reasoning researchers still believe that logic is an appropriate normative system for most human reasoning, let alone a model for describing the process of human reasoning, and many use the paradigm principally to study pragmatic and probabilistic processes. It is suggested that the methods used for studying reasoning be reviewed, especially the instructional context, which necessarily defines pragmatic influences as biases.
In this paper, I show that the question of how dual process theories of reasoning and judgement account for conflict between System 1 (heuristic) and System 2 (analytic) processes needs to be explicated and addressed in future research work. I demonstrate that a simple additive probability model that describes such conflict can be mapped on to three different cognitive models. The pre-emptive conflict resolution model assumes that a decision is made at the outset as to whether a heuristic or analytic process will control the response. The parallel-competitive model assumes that each system operates in parallel to deliver a putative response, resulting sometimes in conflict that then needs to be resolved. Finally, the default-interventionist model involves the cueing of default responses by the heuristic system that may or may not be altered by subsequent intervention of the analytic system. A second, independent issue also emerges from this discussion. The superior performance of higher-ability participants on reasoning tasks may be due to the fact that they engage in more analytic reasoning (quantity hypothesis) or alternatively to the fact that the analytic reasoning they apply is more effective (quality hypothesis).
'If' is one of the most important words in the English language, being used to express hypothetical thought. The use of conditional terms such as 'if' distinguishes human intelligence from that of all other animals. In this volume, Jonathan Evans and David Over present a new theoretical approach to understanding conditionals. The book draws on studies from the psychology of judgement and decision making, as well as philosophical logic.
The two main psychological theories of the ordinary conditional were designed to account for inferences made from assumptions, but few premises in everyday life can be simply assumed true. Useful premises usually have a probability that is less than certainty. But what is the probability of the ordinary conditional and how is it determined? We argue that people use a two-stage Ramsey test that we specify to make probability judgements about indicative conditionals in natural language, and we describe experiments that support this conclusion. Our account can explain why most people give the conditional probability as the probability of the conditional, but also why some give the conjunctive probability. We discuss how our psychological work is related to the analysis of ordinary indicative conditionals in philosophical logic.
A general two-stage theory of human inference is proposed. A distinction is drawn between heuristic processes which select items of task information as ‘relevant’, and analytic processes which operate on the selected items to generate inferences or judgements. These two stages are illustrated in a selective review of work on both deductive and statistical reasoning. Factors identified as contributing to heuristic selection include perceptual salience, linguistic suppositions and semantic associations. Analytic processes are considered to be context dependent: people reason from experience, not from inference rules. The paper includes discussion of the theory in comparison with other contemporary theories of human inference, and in relation to the current debate about human rationality.
In this study, we examine the belief bias effect in syllogistic reasoning under both standard presentation and in a condition where participants are required to respond within 10 seconds. As predicted, the requirement for rapid responding increased the amount of belief bias observed on the task and reduced the number of logically correct decisions, both effects being substantial and statistically significant. These findings were predicted by the dual-process account of reasoning, which posits that fast heuristic processes, responsible for belief bias, compete with slower analytic processes that can lead to correct logical decisions. Requiring rapid responding thus differentially inhibits the operation of analytic reasoning processes, leading to the results observed.
The phenomenon known as matching bias consists of a tendency to see cases as relevant in logical reasoning tasks when the lexical content of a case matches that of a propositional rule, normally a conditional, which applies to that case. Matching is demonstrated by use of the negations paradigm, that is, by using conditionals in which the presence and absence of negative components is systematically varied. The phenomenon was first reported in 1972 and the present paper reviews the history of research and theorising on the problem in the subsequent 25 years. Theories of matching bias considered include those based on several broad frameworks including the heuristic-analytic theory, the mental models theory, the theory of optimal data selection, and relevance theory as well as the specific processing-negations account. The ability of these theories to account for a range of phenomena is considered, including the effects of linguistic form, realistic content, and explicit negation on the matching bias effect. Of particular importance are recent findings showing that the bias is observable on a wider range of linguistic forms than has generally been thought, and that it is almost entirely dependent on the use of implicit negation in the logical cases to which rules are applied. The reasons for the general suppression of matching when realistic content is used are, however, unclear and a need for further research is identified here. It is concluded that matching bias is a highly robust effect which is closely connected with the problem of understanding implicit negation. Most of the theories in the literature are unable to account for at least some of the major phenomena discovered in research on the bias. The accounts that fare best are those that posit local effects of negation, including the heuristic-analytic and processing negations theories.
[About the book] This book explores the idea that we have two minds - one automatic, unconscious, and fast, the other controlled, conscious, and slow. In recent years there has been great interest in so-called dual-process theories of reasoning and rationality. According to such theories, there are two distinct systems underlying human reasoning - an evolutionarily old system that is associative, automatic, unconscious, parallel, and fast, and a more recent, distinctively human system that is rule-based, controlled, conscious, serial, and slow. Within the former, processes are held to be innate and to use heuristics that evolved to solve specific adaptive problems. In the latter, processes are taken to be learned, flexible, and responsive to rational norms. Despite the attention these theories are attracting, there is still poor communication between dual-process theorists themselves, and the substantial bodies of work on dual processes in cognitive psychology and social psychology remain isolated from each other. This book brings together leading researchers on dual processes to summarize the state-of-the-art, highlight key issues, present different perspectives, explore implications, and provide a stimulus to further work. It includes new ideas about the human mind both by contemporary philosophers interested in broad theoretical questions about mental architecture and by psychologists specialising in traditionally distinct and isolated fields. For all those in the cognitive sciences, this is a book that will advance dual-process theorizing, promote interdisciplinary communication, and encourage further applications of dual-process approaches.
Originally identified by Hume, the validity of is–ought inference is much debated in the meta-ethics literature. Our work shows that inference from is to ought typically proceeds from a contextualised, value-laden causal utility conditional, bridging into a deontic conclusion. Such conditional statements tell us what actions are needed to achieve or avoid consequences that are good or bad. Psychological research has established that people generally reason fluently and easily with utility conditionals. Our own research has also shown that people’s reasoning from is to ought is pragmatically sensitive and adapted to achieving the individual’s goals. But how do we acquire the necessary deontic rules? In this paper, we provide a rationale for this facility linked to Evans’s framework of dual mind rationality. People have an old mind which derives its rationality by repeating what has worked in the past, mostly by experiential learning. New mind rationality, in contrast, is evolutionarily recent, uniquely developed in humans, and draws on our ability to mentally simulate hypothetical events removed in time and place. We contend that the new mind achieves its goals by inducing and applying deontic rules and that a mechanism of deontic introduction evolved for this purpose.
M. Oaksford and N. Chater presented a Bayesian analysis of the Wason selection task in which they proposed that people choose cards in order to maximize expected information gain (EIG) as measured by reduction in uncertainty in the Shannon-Weaver information theory sense. It is argued that the EIG measure is both psychologically implausible and normatively inadequate as a measure of epistemic utility. The article is also concerned with the descriptive account of findings in the selection task literature offered by Oaksford and Chater. First, it is shown that their analysis of data reported in the recent article of K. N. Kirby is unsound; second, an EIG analysis is presented of the experiments of P. Pollard and J. St. B. T. Evans that provides a strong empirical disconfirmation of the theory.
Our target article identified normativism as the view that rationality should be evaluated against unconditional normative standards. We believe this to be entrenched in the psychological study of reasoning and decision making and argued that it is damaging to this empirical area of study, calling instead for a descriptivist psychology of reasoning and decision making. The views of 29 commentators (from philosophy and cognitive science as well as psychology) were mixed, including some staunch defences of normativism, but also a number that were broadly supportive of our position, although critical of various details. In particular, many defended a position which sees a role for normative evaluation within boundaries alongside more descriptive research goals. In this response, we clarify our use of the term and add discussion of defining both as descriptive and non-normative concepts. We consider the debate with reference to dual-process theory, the psychology of reasoning, and empirical research strategy in these fields. We also discuss cognitive variation by age, intelligence, and culture, and the issue of relative versus absolute definitions of norms. In conclusion, we hope at least to have raised consciousness about the important boundaries between norm and description in the psychology of thinking.
I argue that views of human rationality are strongly affected by the adoption of a two minds theory in which humans have an old mind which evolved early and shares many features of animal cognition, as well as a new mind which evolved later and is distinctively developed in humans. Both minds have a form of instrumental rationality—striving for the attainment of goals—but by very different mechanisms. The old mind relies on a combination of evolution and experiential learning, and is therefore driven entirely by repeating behaviours which succeeded in the past. The new mind, however, permits the solution of novel problems by reasoning about the future, enabling consequential decision making. I suggest that the concept of epistemic rationality—striving for true knowledge—can only usefully be applied to the new mind with its access to explicit knowledge and beliefs. I also suggest that we commonly interpret behaviour as irrational when the old mind conflicts with the new and frustrates the goals of the conscious person.
Dual-process theories of higher cognition, distinguishing between intuitive (Type 1) and reflective (Type 2) thinking, have become increasingly popular, although also subject to recent criticism. A key question, to which a number of contributions in this special issue relate, is how to define the difference between the two kinds of processing. One issue discussed is whether they differ at Marr’s computational level of analysis. I believe they do but that ultimately the debate will be decided at the implementational level, where distinct cognitive and neural systems need to be demonstrated. Other distinctions raised in the issue are the unique ability for metarepresentation, cognitive decoupling and hypothetical thinking at the Type 2 level, and the association of emotion and metacognitive feelings with the Type 1 level. The relation of the latter to cognitive control is also discussed.
Johnson-Laird and Byrne present a theory of conditional inference based upon the manipulation of mental models. In the present paper, the theory is critically examined with regard to its ability to account for psychological data, principally with respect to the rate at which people draw the four basic inferences of modus ponens, denial of the antecedent, affirmation of the consequent and modus tollens. It is argued first that the theory is unclear in its definition and in particular with regard to predictions of problem difficulty. Clarification and specification of principles are consequently provided here. Next, it is argued that there are a number of phenomena in the conditional reasoning literature for which the theory cannot account in its present form. Specifically, the relative frequency of DA and AC inferences on affirmative conditionals is not as predicted by the theory, differences occur between inferences on 'if then' and 'only if' rules beyond the capacity of the theory to explain, and there is no account of the “negative conclusion bias” observed when negated components are introduced into the rules. A number of revisions to the mental model theory of conditional reasoning are proposed in order to account for these findings.
We tested the hypothesis that choices determined by Type 1 processes are compelling because they are fluent, and for this reason they are less subject to analytic thinking than other answers. A total of 104 participants completed a modified version of Wason's selection task wherein they made decisions about one card at a time using a two-response paradigm. In this paradigm participants gave a fast, intuitive response, rated their feeling of rightness (FOR) for that response, and were then allowed free time to reconsider their answers. As we predicted, answers consistent with a matching heuristic were made more quickly than other answers, were given higher FOR ratings, and received less subsequent analysis as measured by rethinking time and the probability of changing answers. These data suggest that reasoning biases may be compelling because they are fluently generated; this in turn creates a strong FOR, which acts as a signal that further analysis is not necessary.
Thinking is the essence of what it means to be human and defines us more than anything else as a species. Jonathan Evans explores cognitive psychological approaches to understanding the nature of thinking and reasoning, problem solving, and decision making.
Matching bias in conditional reasoning consists of a tendency to select as relevant cases whose lexical content matches that referred to in the conditional statement, regardless of the presence of negatives. Evans demonstrated that use of explicit rather than implicit negative cases markedly reduced the matching bias effect on the conditional truth table task. In apparent contrast, recent studies of explicit negation on the Wason selection task have failed to find evidence of logical facilitation. Experiment 1 of the present study strongly replicated the Evans findings and extended them to three forms of conditional statement. Experiments 2 and 3 showed further that the use of explicit negatives removed completely the matching bias effect on the Wason selection task. However, consistent with other recent studies, this elimination of bias did not lead to facilitation of correct responding. The findings are interpreted as providing evidence that matching bias reflects a linguistically cued relevance effect.
This qualitative study was positioned within an emerging scientific field concerned with the interaction between biblical text and the psychological profile of the preacher. The theoretical framework was provided by the sensing, intuition, feeling and thinking (SIFT) approach to biblical hermeneutics, an approach rooted in reader-perspective hermeneutical theory and in Jungian psychological type theory that explores the distinctive readings of sensing perception and intuitive perception, and the distinctive readings of thinking evaluation and feeling evaluation. The empirical methodology was provided by developing a research tradition concerned with applying the SIFT approach to biblical text. In the present study, a group of 17 Anglican clergy were invited to work in psychological type-alike groups to explore two of the biblical passages identified by Year B of the Revised Common Lectionary for the Feast of Christ the King. Dividing into three workshops, according to their preferences for sensing and intuition, the clergy explored Psalm 93. Dividing into three workshops, according to their preferences for thinking and feeling, the clergy explored John 18:33–37. The rich data gathered from these workshops supported the hypothesis that biblical interpretation and preaching may be shaped by the reader’s psychological type preference and suggested that the passages of scripture proposed for the Feast of Christ the King may be a joy for intuitive thinking types, but a nightmare for sensing feeling types. Contribution: Situated within the reader-perspective approach to biblical hermeneutics, the SIFT method is concerned with identifying the influence of the psychological type of the reader in shaping the interpretation of text. Employing this method, the present study contributes to the fields of homiletics and hermeneutics by demonstrating how some readers may struggle more than others to interpret the scripture readings proposed by the lectionary for the Feast of Christ the King.
In two experiments we tested the hypothesis that the mechanisms that produce belief bias generalise across reasoning tasks. In formal reasoning (i.e., syllogisms) judgements of validity are influenced by actual validity, believability of the conclusions, and an interaction between the two. Although apparently analogous effects of belief and argument strength have been observed in informal reasoning, the design of those studies does not permit an analysis of the interaction effect. In the present studies we redesigned two informal reasoning tasks: the Argument Evaluation Task (AET) and a Law of Large Numbers (LLN) task, in order to test the similarity of the phenomena concerned. Our findings provide little support for the idea that belief bias on formal and informal reasoning is a unitary phenomenon. First, there was no correlation across individuals in the extent of belief bias shown on the three tasks. Second, evidence for a belief by strength interaction was observed only on the AET and under conditions not required for the comparable finding on syllogistic reasoning. Finally, we found that while conclusion believability strongly influenced assessments of argument strength, it had a relatively weak influence on the verbal justifications offered on the two informal reasoning tasks.