43 results found.
  1. A counterfactual simulation model of causal judgments for physical events. Tobias Gerstenberg, Noah D. Goodman, David A. Lagnado & Joshua B. Tenenbaum - 2021 - Psychological Review 128 (5):936-975.
  2. Rational Use of Cognitive Resources: Levels of Analysis Between the Computational and the Algorithmic. Thomas L. Griffiths, Falk Lieder & Noah D. Goodman - 2015 - Topics in Cognitive Science 7 (2):217-229.
    Marr's levels of analysis—computational, algorithmic, and implementation—have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the (...)
    63 citations.
  3. Knowledge and Implicature: Modeling Language Understanding as Social Cognition. Noah D. Goodman & Andreas Stuhlmüller - 2013 - Topics in Cognitive Science 5 (1):173-184.
    Is language understanding a special case of social cognition? To help evaluate this view, we can formalize it as the rational speech-act theory: Listeners assume that speakers choose their utterances approximately optimally, and listeners interpret an utterance by using Bayesian inference to “invert” this model of the speaker. We apply this framework to model scalar implicature (“some” implies “not all,” and “N” implies “not more than N”). This model predicts an interaction between the speaker's knowledge state and the listener's interpretation. (...)
    53 citations.
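    A minimal Python sketch of the rational speech-act recursion summarized in entry 3 above. The toy domain (three apples, the utterances "none"/"some"/"all", a uniform prior, and the optimality parameter alpha) is an illustrative assumption, not the paper's own experimental setup, which additionally models the speaker's knowledge state.

      import numpy as np

      # Toy domain assumed for illustration: how many of 3 apples are red (0..3),
      # plus three utterances with standard literal semantics.
      worlds = [0, 1, 2, 3]
      utterances = ["none", "some", "all"]
      meanings = {
          "none": lambda w: w == 0,
          "some": lambda w: w >= 1,
          "all":  lambda w: w == 3,
      }
      alpha = 1.0  # speaker optimality

      def normalize(p):
          p = np.asarray(p, dtype=float)
          return p / p.sum()

      def literal_listener(u):
          # L0: condition a uniform prior over worlds on the literal meaning of u.
          return normalize([float(meanings[u](w)) for w in worlds])

      def speaker(w):
          # S1: choose utterances softmax-optimally so that L0 recovers the true world.
          log_util = np.array([np.log(literal_listener(u)[worlds.index(w)] + 1e-10)
                               for u in utterances])
          return normalize(np.exp(alpha * log_util))

      def pragmatic_listener(u):
          # L1: Bayesian inversion of the speaker model under a uniform world prior.
          return normalize([speaker(w)[utterances.index(u)] for w in worlds])

      print(dict(zip(worlds, pragmatic_listener("some").round(2))))
      # Probability mass shifts away from the all-red world: "some" implicates "not all".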
  4. The language of generalization. Michael Henry Tessler & Noah D. Goodman - 2019 - Psychological Review 126 (3):395-436.
    26 citations.
  5. The double-edged sword of pedagogy: Instruction limits spontaneous exploration and discovery. Elizabeth Bonawitz, Patrick Shafto, Hyowon Gweon, Noah D. Goodman, Elizabeth Spelke & Laura Schulz - 2011 - Cognition 120 (3):322-330.
  6. The logical primitives of thought: Empirical foundations for compositional cognitive models. Steven T. Piantadosi, Joshua B. Tenenbaum & Noah D. Goodman - 2016 - Psychological Review 123 (4):392-424.
    38 citations.
  7. A Rational Analysis of Rule‐Based Concept Learning. Noah D. Goodman, Joshua B. Tenenbaum, Jacob Feldman & Thomas L. Griffiths - 2008 - Cognitive Science 32 (1):108-154.
    This article proposes a new model of human concept learning that provides a rational analysis of learning feature‐based concepts. This model is built upon Bayesian inference for a grammatically structured hypothesis space—a concept language of logical rules. This article compares the model predictions to human generalization judgments in several well‐known category learning experiments, and finds good agreement for both average and individual participant generalizations. This article further investigates judgments for a broad set of 7‐feature concepts—a more natural setting in several (...)
    68 citations.
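    The grammar-based Bayesian analysis in entry 7 can be caricatured with a small Python sketch. The feature names, the tiny hypothesis space (single literals and two-literal conjunctions), the length-penalizing prior, and the noise parameter are all illustrative assumptions; the paper's model defines a full probabilistic grammar over disjunctive-normal-form rules.

      from itertools import combinations

      FEATURES = ["round", "red", "large"]  # hypothetical boolean features

      def make_rules():
          # Tiny hypothesis space: single literals and two-literal conjunctions.
          literals = [(f, v) for f in FEATURES for v in (True, False)]
          return [[lit] for lit in literals] + [list(p) for p in combinations(literals, 2)]

      def rule_applies(rule, item):
          return all(item[f] == v for f, v in rule)

      def posterior(rules, data, noise=0.1, penalty=2.0):
          # Prior favors shorter rules; likelihood tolerates occasional mislabeled examples.
          scores = []
          for rule in rules:
              prior = penalty ** (-len(rule))
              lik = 1.0
              for item, label in data:
                  lik *= (1 - noise) if rule_applies(rule, item) == label else noise
              scores.append(prior * lik)
          total = sum(scores)
          return [s / total for s in scores]

      def p_positive(rules, post, item):
          # Posterior predictive: probability that a new item falls under the concept.
          return sum(p for rule, p in zip(rules, post) if rule_applies(rule, item))

      # Hypothetical labeled examples, roughly consistent with the rule "red".
      data = [({"round": True, "red": True, "large": False}, True),
              ({"round": False, "red": True, "large": True}, True),
              ({"round": True, "red": False, "large": True}, False)]
      rules = make_rules()
      post = posterior(rules, data)
      print(round(p_positive(rules, post, {"round": False, "red": True, "large": False}), 2))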
  8. When redundancy is useful: A Bayesian approach to “overinformative” referring expressions. Judith Degen, Robert D. Hawkins, Caroline Graf, Elisa Kreiss & Noah D. Goodman - 2020 - Psychological Review 127 (4):591-621.
    14 citations.
  9. Adjectival vagueness in a Bayesian model of interpretation. Daniel Lassiter & Noah D. Goodman - 2017 - Synthese 194 (10):3801-3836.
    We derive a probabilistic account of the vagueness and context-sensitivity of scalar adjectives from a Bayesian approach to communication and interpretation. We describe an iterated-reasoning architecture for pragmatic interpretation and illustrate it with a simple scalar implicature example. We then show how to enrich the apparatus to handle pragmatic reasoning about the values of free variables, explore its predictions about the interpretation of scalar adjectives, and show how this model implements Edgington’s (in Keefe & Smith (eds.), Vagueness: A Reader, 1997) account of the sorites paradox, (...)
    21 citations.
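    The lifted-variable idea behind entry 9 can be written compactly: the pragmatic listener jointly infers the world state (e.g., a height h) and the free threshold θ that the literal meaning of a scalar adjective leaves open. In schematic RSA notation (the symbols and the cost term below are illustrative, not the paper's exact formulation):

      L_0(h \mid u, \theta) \propto [\![u]\!]^{\theta}(h) \, P(h)
      S_1(u \mid h, \theta) \propto \exp\!\big(\alpha \,[\log L_0(h \mid u, \theta) - C(u)]\big)
      L_1(h, \theta \mid u) \propto S_1(u \mid h, \theta) \, P(h) \, P(\theta)

    For u = "tall" with [[tall]]^θ(h) = 1 iff h ≥ θ, marginalizing L_1 over θ yields a graded, context-sensitive interpretation of the bare adjective.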
  10. The Division of Labor in Communication: Speakers Help Listeners Account for Asymmetries in Visual Perspective. Robert D. Hawkins, Hyowon Gweon & Noah D. Goodman - 2021 - Cognitive Science 45 (3):e12926.
    Recent debates over adults' theory of mind use have been fueled by surprising failures of perspective-taking in communication, suggesting that perspective-taking may be relatively effortful. Yet adults routinely engage in effortful processes when needed. How, then, should speakers and listeners allocate their resources to achieve successful communication? We begin with the observation that the shared goal of communication induces a natural division of labor: The resources one agent chooses to allocate toward perspective-taking should depend on their expectations about the other's (...)
    9 citations.
  11. (1 other version) The Structure and Dynamics of Scientific Theories: A Hierarchical Bayesian Perspective. Leah Henderson, Noah D. Goodman, Joshua B. Tenenbaum & James F. Woodward - 2010 - Philosophy of Science 77 (2):172-200.
    Hierarchical Bayesian models (HBMs) provide an account of Bayesian inference in a hierarchically structured hypothesis space. Scientific theories are plausibly regarded as organized into hierarchies in many cases, with higher levels sometimes called ‘paradigms’ and lower levels encoding more specific or concrete hypotheses. Therefore, HBMs provide a useful model for scientific theory change, showing how higher‐level theory change may be driven by the impact of evidence on lower levels. HBMs capture features described in the Kuhnian tradition, particularly the idea that (...)
    37 citations.
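    A generic hierarchical Bayesian schema makes the claim of entry 11 concrete (the notation is a standard HBM sketch, not quoted from the paper): with framework-level or "paradigm"-level assumptions T and specific hypotheses H, evidence E updates both levels at once,

      P(H, T \mid E) \propto P(E \mid H) \, P(H \mid T) \, P(T)
      P(T \mid E) = \sum_{H} P(H, T \mid E)

    so higher-level theory change is driven by the impact of evidence on the lower-level hypotheses that the framework generates.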
  12. Bootstrapping in a language of thought: A formal model of numerical concept learning. Steven T. Piantadosi, Joshua B. Tenenbaum & Noah D. Goodman - 2012 - Cognition 123 (2):199-217.
  13. Learning a theory of causality. Noah D. Goodman, Tomer D. Ullman & Joshua B. Tenenbaum - 2011 - Psychological Review 118 (1):110-119.
    34 citations.
  14. Where science starts: Spontaneous experiments in preschoolers’ exploratory play. Claire Cook, Noah D. Goodman & Laura E. Schulz - 2011 - Cognition 120 (3):341-349.
    30 citations.
  15. The effects of information utility and teachers’ knowledge on evaluations of under-informative pedagogy across development. Ilona Bass, Elizabeth Bonawitz, Daniel Hawthorne-Madell, Wai Keen Vong, Noah D. Goodman & Hyowon Gweon - 2022 - Cognition 222 (C):104999.
  16. Going beyond the evidence: Abstract laws and preschoolers’ responses to anomalous data. Laura E. Schulz, Noah D. Goodman, Joshua B. Tenenbaum & Adrianna C. Jenkins - 2008 - Cognition 109 (2):211-223.
  17. Computational Models of Emotion Inference in Theory of Mind: A Review and Roadmap. Desmond C. Ong, Jamil Zaki & Noah D. Goodman - 2019 - Topics in Cognitive Science 11 (2):338-357.
    An important, but relatively neglected, aspect of human theory of mind is emotion inference: understanding how and why a person feels a certain way is central to reasoning about their beliefs, desires, and plans. The authors review recent work that has begun to unveil the structure and determinants of emotion inference, organizing these findings within a unified probabilistic framework.
    8 citations.
  18. From partners to populations: A hierarchical Bayesian account of coordination and convention. Robert D. Hawkins, Michael Franke, Michael C. Frank, Adele E. Goldberg, Kenny Smith, Thomas L. Griffiths & Noah D. Goodman - 2023 - Psychological Review 130 (4):977-1016.
  19. Affective cognition: Exploring lay theories of emotion. Desmond C. Ong, Jamil Zaki & Noah D. Goodman - 2015 - Cognition 143 (C):141-162.
    11 citations.
  20. Learning to Learn Causal Models. Charles Kemp, Noah D. Goodman & Joshua B. Tenenbaum - 2010 - Cognitive Science 34 (7):1185-1243.
    Learning to understand a single causal system can be an achievement, but humans must learn about multiple causal systems over the course of a lifetime. We present a hierarchical Bayesian framework that helps to explain how learning about several causal systems can accelerate learning about systems that are subsequently encountered. Given experience with a set of objects, our framework learns a causal model for each object and a causal schema that captures commonalities among these causal models. The schema organizes the (...)
    16 citations.
  21. Remembrance of inferences past: Amortization in human hypothesis generation. Ishita Dasgupta, Eric Schulz, Noah D. Goodman & Samuel J. Gershman - 2018 - Cognition 178 (C):67-81.
    6 citations.
  22. How many kinds of reasoning? Inference, probability, and natural language semantics. Daniel Lassiter & Noah D. Goodman - 2015 - Cognition 136 (C):123-134.
    9 citations.
  23. The Strategic Use of Noise in Pragmatic Reasoning. Leon Bergen & Noah D. Goodman - 2015 - Topics in Cognitive Science 7 (2):336-350.
    We combine two recent probabilistic approaches to natural language understanding, exploring the formal pragmatics of communication on a noisy channel. We first extend a model of rational communication between a speaker and listener, to allow for the possibility that messages are corrupted by noise. In this model, common knowledge of a noisy channel leads to the use and correct understanding of sentence fragments. A further extension of the model, which allows the speaker to intentionally reduce the noise rate on a (...)
    8 citations.
  24. The Interactions of Rational, Pragmatic Agents Lead to Efficient Language Structure and Use. Benjamin N. Peloquin, Noah D. Goodman & Michael C. Frank - 2020 - Topics in Cognitive Science 12 (1):433-445.
    Despite their diversity, human languages share consistent properties and regularities. Where does this consistency come from? And does it tell us something about the problem that all languages need to solve? The authors provide an intriguing analysis that focuses on the “communicative function of ambiguity,” whose resolution entails an equally intriguing “speaker–listener cross‐entropy objective for measuring the efficiency of linguistic systems from first principles of efficient language use.”
    3 citations.
  25. Characterizing the Dynamics of Learning in Repeated Reference Games. Robert D. Hawkins, Michael C. Frank & Noah D. Goodman - 2020 - Cognitive Science 44 (6):e12845.
    The language we use over the course of conversation changes as we establish common ground and learn what our partner finds meaningful. Here we draw upon recent advances in natural language processing to provide a finer‐grained characterization of the dynamics of this learning process. We release an open corpus (>15,000 utterances) of extended dyadic interactions in a classic repeated reference game task where pairs of participants had to coordinate on how to refer to initially difficult‐to‐describe tangram stimuli. We find that (...)
    3 citations.
  26. Learning causal schemata. Charles Kemp, Noah D. Goodman & Joshua B. Tenenbaum - 2007 - In D. S. McNamara & J. G. Trafton (eds.), Proceedings of the 29th Annual Cognitive Science Society. Cognitive Science Society. pp. 389-394.
    9 citations.
  27. Extremely costly intensifiers are stronger than quite costly ones. Erin D. Bennett & Noah D. Goodman - 2018 - Cognition 178:147-161.
    3 citations.
  28. A Computational Model of Linguistic Humor in Puns. Justine T. Kao, Roger Levy & Noah D. Goodman - 2016 - Cognitive Science 40 (5):1270-1285.
    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures—ambiguity and distinctiveness—derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human (...)
    4 citations.
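    The two information-theoretic measures named in entry 28 can be illustrated with a generic Python sketch. The toy numbers, the use of Shannon entropy for ambiguity, and the use of symmetrized KL divergence for distinctiveness are assumptions made here for illustration; the paper derives its measures from a noisy-channel model of sentence processing.

      import numpy as np

      def entropy(p):
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      def kl(p, q, eps=1e-12):
          p = np.asarray(p, dtype=float) + eps
          q = np.asarray(q, dtype=float) + eps
          p, q = p / p.sum(), q / q.sum()
          return float((p * np.log2(p / q)).sum())

      def ambiguity(meaning_posterior):
          # High when both readings of the sentence remain plausible after reading it.
          return entropy(meaning_posterior)

      def distinctiveness(support_m1, support_m2):
          # High when different words in the sentence support the two readings
          # (symmetrized KL over per-word support distributions, an assumption here).
          return 0.5 * (kl(support_m1, support_m2) + kl(support_m2, support_m1))

      # Hypothetical values for a pun-like sentence with two competing readings.
      print(round(ambiguity([0.55, 0.45]), 2))                           # close to 1 bit
      print(round(distinctiveness([0.7, 0.2, 0.1], [0.1, 0.2, 0.7]), 2))  # readings cued by different words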
  29. Cause and intent: Social reasoning in causal learning. Noah D. Goodman, Chris L. Baker & Joshua B. Tenenbaum - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society. pp. 2759-2764.
    7 citations.
  30. How tall is tall? Compositionality, statistics, and gradable adjectives. Lauren A. Schmidt, Noah D. Goodman, David Barner & Joshua B. Tenenbaum - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society.
    7 citations.
  31. Probabilistic semantics and pragmatics: Uncertainty in language and thought. Noah D. Goodman & Daniel Lassiter - 2015 - In Shalom Lappin (ed.), The Handbook of Contemporary Semantic Theory. Cambridge, Mass., USA: Blackwell Reference.
  32. Informative communication in word production and word learning. Michael C. Frank, Noah D. Goodman, Peter Lai & Joshua B. Tenenbaum - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society.
    5 citations.
  33. Resolving uncertainty in plural predication. Gregory Scontras & Noah D. Goodman - 2017 - Cognition 168 (C):294-311.
    2 citations.
  34. Analyzing Machine‐Learned Representations: A Natural Language Case Study. Ishita Dasgupta, Demi Guo, Samuel J. Gershman & Noah D. Goodman - 2020 - Cognitive Science 44 (12):e12925.
    As modern deep networks become more complex, and get closer to human‐like capabilities in certain domains, the question arises as to how the representations and decision rules they learn compare to the ones in humans. In this work, we study representations of sentences in one such artificial system for natural language processing. We first present a diagnostic test dataset to examine the degree of abstract composable structure represented. Analyzing performance on these diagnostic tests indicates a lack of systematicity in representations (...)
    1 citation.
  35. Probabilistic programming versus meta-learning as models of cognition. Desmond C. Ong, Tan Zhi-Xuan, Joshua B. Tenenbaum & Noah D. Goodman - 2024 - Behavioral and Brain Sciences 47:e158.
    We summarize the recent progress made by probabilistic programming as a unifying formalism for the probabilistic, symbolic, and data-driven aspects of human cognition. We highlight differences with meta-learning in flexibility, statistical assumptions, and inferences about cognition. We suggest that the meta-learning approach could be further strengthened by considering connectionist and Bayesian approaches together, rather than exclusively one or the other.
  36. Beyond Boolean logic: Exploring representation languages for learning complex concepts. Steven T. Piantadosi, Joshua B. Tenenbaum & Noah D. Goodman - 2010 - In S. Ohlsson & R. Catrambone (eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science Society. Cognitive Science Society. pp. 859-864.
  37. Relational and role-governed categories: Views from psychology, computational modeling, and linguistics. Micah B. Goldwater, Noah D. Goodman, Stephen Wechsler & Gregory L. Murphy - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society.
  38. Compositionality in rational analysis: Grammar-based induction for concept learning. Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
    2 citations.
  39. Logic, Probability, and Pragmatics in Syllogistic Reasoning. Michael Henry Tessler, Joshua B. Tenenbaum & Noah D. Goodman - 2022 - Topics in Cognitive Science 14 (3):574-601.
  40. Comparing pluralities. Gregory Scontras, Peter Graff & Noah D. Goodman - 2012 - Cognition 123 (1):190-197.
  41. Warm (for Winter): Inferring Comparison Classes in Communication. Michael Henry Tessler & Noah D. Goodman - 2022 - Cognitive Science 46 (3):e13095.
  42. Compositionality in rational analysis: grammar-based induction for concept learning. Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
    1 citation.
  43. Avoiding frostbite: It helps to learn from others. Michael Henry Tessler, Noah D. Goodman & Michael C. Frank - 2017 - Behavioral and Brain Sciences 40.
    Machines that learn and think like people must be able to learn from others. Social learning speeds up the learning process and – in combination with language – is a gateway to abstract and unobservable information. Social learning also facilitates the accumulation of knowledge across generations, helping people and artificial intelligences learn things that no individual could learn in a lifetime.