Computational Learning Theory and Language Acquisition
Abstract

Computational learning theory explores the limits of learnability. Studying language acquisition from this perspective involves identifying classes of languages that are learnable from the available data, within the limits of time and computational resources available to the learner. Different models of learning can yield radically different learnability results, depending on the model's assumptions about the nature of the learning process and about the data, time, and resources to which learners have access. To the extent that such assumptions accurately reflect human language learning, a model that invokes them can offer important insights into the formal properties of natural languages, and the way in which their representations might be efficiently acquired. In this chapter we consider several computational learning models that have been applied to the language learning task. Some of these have yielded results that suggest that the class of natural languages cannot be efficiently learned from the primary linguistic data (PLD) available to children, through..
Similar books and articles
Complexity in Language Acquisition. Alexander Clark & Shalom Lappin - 2013 - Topics in Cognitive Science 5 (1): 89-110.
Machine Learning Theory and Practice as a Source of Insight Into Universal Grammar. Stuart M. Shieber - unknown
Language and the Learning Curve: A New Theory of Syntactic Development. Anat Ninio - 2006 - Oxford University Press.
Machine Learning Theory and Practice as a Source of Insight Into Universal Grammar. Shalom Lappin - unknown
A Probabilistic Computational Model of Cross-Situational Word Learning. Afsaneh Fazly, Afra Alishahi & Suzanne Stevenson - 2010 - Cognitive Science 34 (6): 1017-1063.
Language Learning From Positive Evidence, Reconsidered: A Simplicity-Based Approach. Anne S. Hsu, Nick Chater & Paul Vitányi - 2013 - Topics in Cognitive Science 5 (1): 35-55.
Learning Colour Words is Slow: A Cross-Situational Learning Account. Paul Vogt & Andrew D. M. Smith - 2005 - Behavioral and Brain Sciences 28 (4): 509-510.
iMinerva: A Mathematical Model of Distributional Statistical Learning. Erik D. Thiessen & Philip I. Pavlik - 2013 - Cognitive Science 37 (2): 310-343.
Endogenous Constraints on Inductive Reasoning. Andre Kukla - 1992 - Philosophical Psychology 5 (4): 411-425.
Citations of this work
On the Necessity of U-Shaped Learning. Lorenzo Carlucci & John Case - 2013 - Topics in Cognitive Science 5 (1): 56-88.
A Model of Language Learning with Semantics and Meaning-Preserving Corrections. Dana Angluin & Leonor Becerra-Bonache - 2017 - Artificial Intelligence 242: 23-51.