Citations
  • Could a Computer Learn to Be an Appeals Court Judge? The Place of the Unspeakable and Unwriteable in All-Purpose Intelligent Systems. John Woods - 2022 - Philosophies 7 (5):95.
    I will take it that general intelligence is intelligence of the kind that a typical human being—Fred, say—manifests in his role as a cognitive agent, that is, as an acquirer, receiver and circulator of knowledge in his cognitive economy. Framed in these terms, the word “general” underserves our ends. Hereafter our questions will bear upon the all-purpose intelligence of beings like Fred. Frederika appears as Fred’s AI-counterpart, not as a fully programmed and engineered being, but as a presently unrealized theoretical (...)
  • Are machines radically contextualist? Ryan M. Nefdt - 2023 - Mind and Language 38 (3):750-771.
    In this article, I describe a novel position on the semantics of artificial intelligence. I present a problem for the current artificial neural networks used in machine learning, specifically in relation to natural language tasks. I then propose that, at the metasemantic level, meaning in machines can best be interpreted as radically contextualist. Finally, I consider what this might mean for human-level semantic competence from a comparative perspective.
  • Knowledge-augmented face perception: Prospects for the Bayesian brain-framework to align AI and human vision. Martin Maier, Florian Blume, Pia Bideau, Olaf Hellwich & Rasha Abdel Rahman - 2022 - Consciousness and Cognition 101:103301.
  • (What) Can Deep Learning Contribute to Theoretical Linguistics? Gabe Dupre - 2021 - Minds and Machines 31 (4):617-635.
    Deep learning techniques have revolutionised artificial systems’ performance on myriad tasks, from playing Go to medical diagnosis. Recent developments have extended such successes to natural language processing, an area once deemed beyond such systems’ reach. Despite their different goals, these successes have suggested that such systems may be pertinent to theoretical linguistics. The competence/performance distinction presents a fundamental barrier to such inferences. While DL systems are trained on linguistic performance, linguistic theories are aimed at competence. Such a barrier has traditionally (...)
  • Finding event structure in time: What recurrent neural networks can tell us about event structure in mind. Forrest Davis & Gerry T. M. Altmann - 2021 - Cognition 213 (C):104651.
  • An Alternative to Cognitivism: Computational Phenomenology for Deep Learning. Pierre Beckmann, Guillaume Köstner & Inês Hipólito - 2023 - Minds and Machines 33 (3):397-427.
    We propose a non-representationalist framework for deep learning relying on a novel method, computational phenomenology: a dialogue between the first-person perspective (relying on phenomenology) and the mechanisms of computational models. We thereby propose an alternative to the modern cognitivist interpretation of deep learning, according to which artificial neural networks encode representations of external entities. This interpretation mainly relies on neuro-representationalism, a position that combines a strong ontological commitment towards scientific theoretical entities and the idea that the brain operates on symbolic (...)