Results for 'grammar induction, probabilistic context-free grammars, the EM algorithm'

1000+ found
  1.  30
    A Fast Method for Learning Large-Scale Context-Free Grammars from Real Corpora Using Parse Forests [構文森を用いた実コーパスからの大規模な文脈自由文法の高速学習法].亀谷 由隆 栗原 賢一 (Yoshitaka Kameya & Kenichi Kurihara) - 2004 - Transactions of the Japanese Society for Artificial Intelligence 19:360-367.
    The task of inducing grammar structures has received a great deal of attention. Researchers' motivations differ: some use grammar induction as the first stage in building large treebanks, others to build better language models. However, grammar induction has inherent computational complexity. To overcome it, some grammar induction algorithms add new production rules incrementally, refining the grammar while keeping the computational cost low. In this paper, we propose a new (...)
    No categories
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  2.  41
    An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars.Dan Klein & Christopher D. Manning - unknown
    While O(n³) methods for parsing probabilistic context-free grammars (PCFGs) are well known, a tabular parsing framework for arbitrary PCFGs which allows for bottom-up, top-down, and other parsing strategies has not yet been provided. This paper presents such an algorithm, and shows its correctness and advantages over prior work. The paper finishes by bringing out the connections between the algorithm and work on hypergraphs, which permits us to extend the presented Viterbi (best parse) (...) to an inside (total probability) algorithm.
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   1 citation  
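The inside (total-probability) computation this entry extends Viterbi parsing to can be sketched for the restricted case of a PCFG in Chomsky Normal Form (the paper's algorithm handles arbitrary PCFGs and multiple parsing strategies; the toy grammar and function name below are illustrative, not taken from the paper):

```python
from collections import defaultdict

# Hypothetical toy PCFG in Chomsky Normal Form.
# binary: (A, (B, C)) -> P(A -> B C); lexical: (A, w) -> P(A -> w)
binary = {("S", ("NP", "VP")): 0.9, ("S", ("S", "S")): 0.1,
          ("NP", ("Det", "N")): 1.0, ("VP", ("V", "NP")): 1.0}
lexical = {("Det", "the"): 1.0, ("N", "dog"): 0.5,
           ("N", "cat"): 0.5, ("V", "saw"): 1.0}

def inside_probability(words, binary, lexical, start="S"):
    """CKY-style chart: summing (not maximizing) over analyses of each
    span gives the total probability of the sentence under the grammar."""
    n = len(words)
    # chart[i][j][A] = total probability that A derives words[i:j]
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):                      # length-1 spans
        for (A, word), p in lexical.items():
            if word == w:
                chart[i][i + 1][A] += p
    for span in range(2, n + 1):                       # longer spans, bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):                  # split point
                for (A, (B, C)), p in binary.items():
                    chart[i][j][A] += p * chart[i][k][B] * chart[k][j][C]
    return chart[0][n][start]

print(inside_probability("the dog saw the cat".split(), binary, lexical))
```

Replacing the `+=` accumulation with a `max` over split points and rules turns this into the Viterbi (best-parse) variant the entry starts from.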
  3.  30
    An O(n³) agenda-based chart parser for arbitrary probabilistic context-free grammars.Christopher Manning - manuscript
    Most PCFG parsing work has used the bottom-up CKY algorithm (Kasami, 1965; Younger, 1967) with Chomsky Normal Form grammars (Baker, 1979; ...). The parser presented here applies the “fundamental rule” in an order-independent manner, such that the same basic algorithm supports top-down and bottom-up parsing, and deals correctly with the difficult cases of left-recursive rules, empty elements, and unary rules, in a natural way.
    Direct download  
     
    Export citation  
     
    Bookmark   1 citation  
  4.  58
    A note on the expressive power of probabilistic context free grammars.Gabriel Infante-Lopez & Maarten De Rijke - 2006 - Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a (...)
    Direct download (4 more)  
     
    Export citation  
     
    Bookmark  
  5.  7
    A Note on the Expressive Power of Probabilistic Context Free Grammars.Gabriel Infante-Lopez & Maarten de Rijke - 2006 - Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a (...)
    Direct download  
     
    Export citation  
     
    Bookmark  
  6.  36
    From Exemplar to Grammar: A Probabilistic Analogy‐Based Model of Language Learning.Rens Bod - 2009 - Cognitive Science 33 (5):752-793.
    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase‐structure trees should be assigned to initial sentences, s/he allows (...)
    No categories
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   15 citations  
  7.  26
    Natural Language Grammar Induction using a Constituent-Context Model.Dan Klein & Christopher D. Manning - unknown
    This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
    Direct download  
     
    Export citation  
     
    Bookmark   5 citations  
  8.  43
    Natural language grammar induction using a constituent-context model.Christopher Manning - manuscript
    This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
    Direct download  
     
    Export citation  
     
    Bookmark  
  9.  33
    A Generative Constituent-Context Model for Improved Grammar Induction.Dan Klein & Christopher D. Manning - unknown
    We present a generative distributional model for the unsupervised induction of natural language syntax which explicitly models constituent yields and contexts. Parameter search with EM produces higher quality analyses than previously exhibited by unsupervised systems, giving the best published unsupervised parsing results on the ATIS corpus. Experiments on Penn treebank sentences of comparable length show an even higher F1 of 71% on nontrivial brackets. We compare distributionally induced and actual part-of-speech tags as input data, and examine extensions to the basic (...)
    Direct download  
     
    Export citation  
     
    Bookmark   4 citations  
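The EM parameter search mentioned in these entries alternates an E step, which computes expected rule counts under the current model (via inside-outside probabilities), with an M step, which renormalizes those counts into new rule probabilities. A minimal sketch of the M step only, with hypothetical names and counts (the E step is omitted for brevity):

```python
from collections import defaultdict

def reestimate(expected_counts):
    """M step: normalize expected rule counts per left-hand-side nonterminal,
    so that each nonterminal's rule probabilities sum to 1."""
    totals = defaultdict(float)
    for (lhs, rhs), count in expected_counts.items():
        totals[lhs] += count
    return {(lhs, rhs): count / totals[lhs]
            for (lhs, rhs), count in expected_counts.items()}

# Hypothetical expected counts gathered by one E step over a corpus.
counts = {("S", ("NP", "VP")): 3.0, ("S", ("S", "S")): 1.0,
          ("NP", ("Det", "N")): 4.0}
print(reestimate(counts))
```

Each EM iteration of this kind is guaranteed not to decrease corpus likelihood, which is why likelihood-driven PCFG induction is the baseline these papers compare against.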
  10.  49
    Grammar induction by unification of type-logical lexicons.Sean A. Fulop - 2010 - Journal of Logic, Language and Information 19 (3):353-381.
    A method is described for inducing a type-logical grammar from a sample of bare sentence trees which are annotated by lambda terms, called term-labelled trees . Any type logic from a permitted class of multimodal logics may be specified for use with the procedure, which induces the lexicon of the grammar including the grammatical categories. A first stage of semantic bootstrapping is performed, which induces a general form lexicon from the sample of term-labelled trees using Fulop’s (J Log (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark  
  11.  51
    On the expressive power of abstract categorial grammars: Representing context-free formalisms. [REVIEW]Philippe de Groote & Sylvain Pogodalla - 2004 - Journal of Logic, Language and Information 13 (4):421-438.
    We show how to encode context-free string grammars, linear context-free tree grammars, and linear context-free rewriting systems as Abstract Categorial Grammars. These three encodings share the same constructs, the only difference being the interpretation of the composition of the production rules. It is interpreted as a first-order operation in the case of context-free string grammars, as a second-order operation in the case of linear context-free tree grammars, and as a third-order (...)
    Direct download (4 more)  
     
    Export citation  
     
    Bookmark   6 citations  
  12.  29
    IDL-PMCFG, a Grammar Formalism for Describing Free Word Order Languages.François Hublet - 2022 - Journal of Logic, Language and Information 31 (3):327-388.
    We introduce _Interleave-Disjunction-Lock parallel multiple context-free grammars_ (IDL-PMCFG), a novel grammar formalism designed to describe the syntax of free word order languages that allow for extensive interleaving of grammatical constituents. Though interleaved constituents, and especially the so-called hyperbaton, are common in several ancient (Classical Latin and Greek, Sanskrit...) and modern (Hungarian, Finnish...) languages, these syntactic structures are often difficult to express in existing formalisms. The IDL-PMCFG formalism combines Seki et al.’s parallel multiple context-free grammars (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   1 citation  
  13.  14
    The Equivalence of Unidirectional Lambek Categorial Grammars and Context-Free Grammars.Wojciech Buszkowski - 1985 - Mathematical Logic Quarterly 31 (24):369-384.
    Direct download  
     
    Export citation  
     
    Bookmark   8 citations  
  14.  25
    The Equivalence of Unidirectional Lambek Categorial Grammars and Context-Free Grammars.Wojciech Buszkowski - 1985 - Zeitschrift fur mathematische Logik und Grundlagen der Mathematik 31 (24):369-384.
    Direct download  
     
    Export citation  
     
    Bookmark   8 citations  
  15.  32
    The Equivalence of Tree Adjoining Grammars and Monadic Linear Context-free Tree Grammars.Stephan Kepser & Jim Rogers - 2011 - Journal of Logic, Language and Information 20 (3):361-384.
    The equivalence of leaf languages of tree adjoining grammars and monadic linear context-free grammars was shown about a decade ago. This paper presents a proof of the strong equivalence of these grammar formalisms. Non-strict tree adjoining grammars and monadic linear context-free grammars define the same class of tree languages. We also present a logical characterisation of this tree language class showing that a tree language is a member of this class iff it is the two-dimensional (...)
    Direct download (4 more)  
     
    Export citation  
     
    Bookmark   1 citation  
  16.  14
    On the Expressive Power of Abstract Categorial Grammars: Representing Context-Free Formalisms.Philippe Groote & Sylvain Pogodalla - 2004 - Journal of Logic, Language and Information 13 (4):421-438.
    We show how to encode context-free string grammars, linear context-free tree grammars, and linear context-free rewriting systems as Abstract Categorial Grammars. These three encodings share the same constructs, the only difference being the interpretation of the composition of the production rules. It is interpreted as a first-order operation in the case of context-free string grammars, as a second-order operation in the case of linear context-free tree grammars, and as a third-order (...)
    Direct download  
     
    Export citation  
     
    Bookmark   4 citations  
  17.  33
    The metaphoricality of Marxism and the context-freeing grammar of socialism.Alvin W. Gouldner - 1974 - Theory and Society 1 (4):387-414.
  18.  28
    The equivalence of Nonassociative Lambek Categorial Grammars and Context-Free Grammars.Maciej Kandulski - 1988 - Mathematical Logic Quarterly 34 (1):41-52.
    Direct download  
     
    Export citation  
     
    Bookmark   10 citations  
  19.  34
    The equivalence of Nonassociative Lambek Categorial Grammars and Context-Free Grammars.Maciej Kandulski - 1988 - Zeitschrift fur mathematische Logik und Grundlagen der Mathematik 34 (1):41-52.
    Direct download  
     
    Export citation  
     
    Bookmark   7 citations  
  20.  52
    Product-free Lambek calculus and context-free grammars.Mati Pentus - 1997 - Journal of Symbolic Logic 62 (2):648-660.
    In this paper we prove the Chomsky Conjecture (all languages recognized by the Lambek calculus are context-free) for both the full Lambek calculus and its product-free fragment. For the latter case we present a construction of context-free grammars involving only product-free types.
    Direct download (8 more)  
     
    Export citation  
     
    Bookmark   7 citations  
  21.  13
    Properties of the Derivations According to a Context-Free Grammar.Gabriel Orman - 1973 - In Radu J. Bogdan & Ilkka Niiniluoto (eds.), Logic, Language, and Probability. Boston: D. Reidel Pub. Co.. pp. 226--236.
    Direct download  
     
    Export citation  
     
    Bookmark  
  22.  11
    Feature Selection for a Rich HPSG Grammar Using Decision Trees.Christopher D. Manning & Kristina Toutanova - unknown
    This paper examines feature selection for log linear models over rich constraint-based grammar (HPSG) representations by building decision trees over features in corresponding probabilistic context free grammars (PCFGs). We show that single decision trees do not make optimal use of the available information; constructed ensembles of decision trees based on different feature subspaces show significant performance gains (14% parse selection error reduction). We compare the performance of the learned PCFG grammars and log linear models over (...)
    No categories
    Direct download  
     
    Export citation  
     
    Bookmark   1 citation  
  23. Sequentially indexed grammars.Jan van Eijck - unknown
    This paper defines the grammar class of sequentially indexed grammars. Sequentially indexed grammars are the result of a change in the index stack handling mechanism of indexed grammars [Aho68, Aho69]. Sequentially indexed grammars are different from linear indexed grammars [Gaz88]. Like indexed languages, sequentially indexed languages are a fully abstract language class. Unlike indexed languages, sequentially indexed languages allow polynomial parsing algorithms. We give a polynomial algorithm for parsing with sequentially indexed grammars that is an extension of the (...)
     
    Export citation  
     
    Bookmark  
  24.  26
    Sheila Greibach. A new normal-form theorem for context-free phrase structure grammars. Journal of the Association for Computing Machinery, vol. 12 (1965), pp. 42–52. [REVIEW]Rohit Parikh - 1970 - Journal of Symbolic Logic 34 (4):658-658.
  25. Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy.Thomas Icard - 2020 - Journal of Mathematical Psychology 95.
    A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to familiar probabilistic machine classes. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case where the "semi-linear" languages all collapse into (...)
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   1 citation  
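A probabilistic grammar at any level of such a hierarchy defines a distribution over strings. Ancestral sampling from a toy probabilistic CFG makes this concrete (grammar and names below are illustrative, not drawn from the paper):

```python
import random

# Hypothetical PCFG: nonterminal -> list of (probability, right-hand side).
# Symbols absent from the dict are treated as terminals.
grammar = {"S": [(0.5, ["a"]), (0.5, ["a", "S"])]}

def sample(grammar, symbol="S", rng=random):
    """Sample a terminal string top-down, expanding one rule at a time."""
    if symbol not in grammar:
        return [symbol]                       # terminal symbol
    r, acc = rng.random(), 0.0
    for prob, rhs in grammar[symbol]:
        acc += prob
        if r <= acc:
            return [t for s in rhs for t in sample(grammar, s, rng)]
    # Guard against floating-point underrun: fall back to the last rule.
    rhs = grammar[symbol][-1][1]
    return [t for s in rhs for t in sample(grammar, s, rng)]

print(" ".join(sample(grammar, "S", random.Random(7))))
```

Under this toy grammar the string a^k is sampled with probability 2^-k, i.e. a geometric distribution on (unary notations for) the positive integers, the kind of distribution the entry's hierarchy is designed to classify.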
  26.  12
    Incremental Learning of Context-Free Grammars by Parsing-Based Rule Generation and Rule Set Search [構文解析にもとづく規則生成と規則集合探索による文脈自由文法の漸次学習].保科 明美 中村 克彦 (Akemi Hoshina & Katsuhiko Nakamura) - 2006 - Transactions of the Japanese Society for Artificial Intelligence 21 (4):371-379.
    This paper discusses recent improvements and extensions in the Synapse system for inductive inference of context-free grammars from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and the search for rule sets. The form of production rules in the previous system is extended from Revised Chomsky Normal Form A → βγ to Extended Chomsky Normal Form, which also includes A → B, where each of β and γ is either a terminal or nonterminal symbol. From (...)
    No categories
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  27. Natural languages and context-free languages.Geoffrey K. Pullum & Gerald Gazdar - 1980 - Linguistics and Philosophy 4 (4):471 - 504.
    Notice that this paper has not claimed that all natural languages are CFL's. What it has shown is that every published argument purporting to demonstrate the non-context-freeness of some natural language is invalid, either formally or empirically or both. Whether non-context-free characteristics can be found in the stringset of some natural language remains an open question, just as it was a quarter century ago. Whether the question is ultimately answered in the negative or the affirmative, there will be (...)
    Direct download (4 more)  
     
    Export citation  
     
    Bookmark   26 citations  
  28.  31
    Induction of Augmented Transition Networks.John R. Anderson - 1977 - Cognitive Science 1 (2):125-157.
    LAS is a program that acquires augmented transition network (ATN) grammars. It requires as data sentences of the language and semantic network representatives of their meaning. In acquiring the ATN grammars, it induces the word classes of the language, the rules of formation for sentences, and the rules mapping sentences onto meaning. The induced ATN grammar can be used both for sentence generation and sentence comprehension. Critical to the performance of the program are assumptions that it makes about the (...)
    No categories
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   13 citations  
  29.  18
    Formalization of Context-Free Language Theory.Marcus Vinícius Midena Ramos - 2019 - Bulletin of Symbolic Logic 25 (2):214-214.
    Proof assistants are software-based tools that are used in the mechanization of proof construction and validation in mathematics and computer science, and also in certified program development. Different such tools are being increasingly used in order to accelerate and simplify proof checking, and the Coq proof assistant is one of the most well known and used in large-scale projects. Language and automata theory is a well-established area of mathematics, relevant to computer science foundations and information technology. In particular, context-free language theory is of fundamental importance in the analysis, design, and implementation of computer programming languages. This work describes a formalization effort, using the Coq proof assistant, of fundamental results of the classical theory of context-free grammars and languages. These include closure properties (union, concatenation, and Kleene star), grammar simplification (elimination of useless symbols, inaccessible symbols, empty rules, and unit rules), the existence of a Chomsky Normal Form for context-free grammars and the Pumping Lemma for context-free languages. The result is an important set of libraries covering the main results of context-free language theory, with more than 500 lemmas and theorems fully proved and checked. As it turns out, this is a comprehensive formalization of the classical context-free language theory in the Coq proof assistant and includes the formalization of the Pumping Lemma for context-free languages. The perspectives for the further development of this work are diverse and can be grouped in three different areas: inclusion of new devices and results, code extraction, and general enhancements of its libraries.
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  30. Deductive parsing with sequentially indexed grammars.Jan van Eijck - unknown
    This paper extends the Earley parsing algorithm for context free languages [3] to the case of sequentially indexed languages. Sequentially indexed languages are related to indexed languages [1, 2]. The difference is that parallel processing of index stacks is replaced by sequential processing [4].
     
    Export citation  
     
    Bookmark  
  31. Unsupervised learning and grammar induction.Alex Clark & Shalom Lappin - unknown
    In this chapter we consider unsupervised learning from two perspectives. First, we briefly look at its advantages and disadvantages as an engineering technique applied to large corpora in natural language processing. While supervised learning generally achieves greater accuracy with less data, unsupervised learning offers significant savings in the intensive labour required for annotating text. Second, we discuss the possible relevance of unsupervised learning to debates on the cognitive basis of human language acquisition. In this context we explore the implications (...)
     
    Export citation  
     
    Bookmark  
  32.  47
    Mild context-sensitivity and tuple-based generalizations of context-free grammar.Annius V. Groenink - 1997 - Linguistics and Philosophy 20 (6):607-636.
    This paper classifies a family of grammar formalisms that extend context-free grammar by talking about tuples of terminal strings, rather than independently combining single terminal words into larger single phrases. These include a number of well-known formalisms, such as head grammar and linear context-free rewriting systems, but also a new formalism, (simple) literal movement grammar, which strictly extends the previously known formalisms, while preserving polynomial time recognizability. The descriptive capacity of simple literal movement grammars is illustrated both formally through a weak generative (...)
    Direct download (5 more)  
     
    Export citation  
     
    Bookmark   1 citation  
  33.  30
    Non‐associative Lambek Categorial Grammar in Polynomial Time.Erik Aarts & Kees Trautwein - 1995 - Mathematical Logic Quarterly 41 (4):476-484.
    We present a new axiomatization of the non-associative Lambek calculus. We prove that it takes polynomial time to reduce any non-associative Lambek categorial grammar to an equivalent context-free grammar. Since it is possible to recognize a sentence generated by a context-free grammar in polynomial time, this proves that a sentence generated by any non-associative Lambek categorial grammar can be recognized in polynomial time.
    Direct download  
     
    Export citation  
     
    Bookmark   3 citations  
  34.  49
    No Free Lunch Theorem, Inductive Skepticism, and the Optimality of Meta-induction.Gerhard Schurz - 2017 - Philosophy of Science 84 (5):825-839.
    The no free lunch theorem is a radicalized version of Hume’s induction skepticism. It asserts that relative to a uniform probability distribution over all possible worlds, all computable prediction algorithms—whether ‘clever’ inductive or ‘stupid’ guessing methods—have the same expected predictive success. This theorem seems to be in conflict with results about meta-induction. According to these results, certain meta-inductive prediction strategies may dominate other methods in their predictive success. In this article this conflict is analyzed and dissolved, by means (...)
    Direct download (7 more)  
     
    Export citation  
     
    Bookmark   11 citations  
  35.  7
    Bidirectional context-free grammar parsing for natural language processing.Giorgio Satta & Oliviero Stock - 1994 - Artificial Intelligence 69 (1-2):123-164.
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  36.  47
    Identification in the limit of categorial grammars.Makoto Kanazawa - 1996 - Journal of Logic, Language and Information 5 (2):115-155.
    It is proved that for any k, the class of classical categorial grammars that assign at most k types to each symbol in the alphabet is learnable, in the Gold (1967) sense of identification in the limit from positive data. The proof crucially relies on the fact that the concept known as finite elasticity in the inductive inference literature is preserved under the inverse image of a finite-valued relation. The learning algorithm presented here incorporates Buszkowski and Penn's (1990) (...) for determining categorial grammars from input consisting of functor-argument structures.
    Direct download (5 more)  
     
    Export citation  
     
    Bookmark   4 citations  
  37.  78
    Implicit Acquisition of Grammars With Crossed and Nested Non-Adjacent Dependencies: Investigating the Push-Down Stack Model.Julia Uddén, Martin Ingvar, Peter Hagoort & Karl M. Petersson - 2012 - Cognitive Science 36 (6):1078-1101.
    A recent hypothesis in empirical brain research on language is that the fundamental difference between animal and human communication systems is captured by the distinction between finite-state and more complex phrase-structure grammars, such as context-free and context-sensitive grammars. However, the relevance of this distinction for the study of language as a neurobiological system has been questioned and it has been suggested that a more relevant and partly analogous distinction is that between non-adjacent and adjacent dependencies. Online memory (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   9 citations  
  38.  47
    On Throwing Out the Baby with the Bathwater: A Reply to Black and Wilensky's Evaluation of Story Grammars.Jean M. Mandler & Nancy S. Johnson - 1980 - Cognitive Science 4 (3):305-312.
    A number of criticisms of a recent paper by Black and Wilensky are made. (1) In attempting to assess the observational adequacy of story grammars, they state that a context-free grammar cannot handle discontinuous elements; however, they do not show that such elements occur in the domain to which the grammars apply. Further, they do not present adequate evidence for their claim that there are acceptable stories not accounted for by existing grammars and that the grammars will accept nonstories such as (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   29 citations  
  39. Context and Coherence: The Logic and Grammar of Prominence.Una Stojnic - 2021 - Oxford, UK: Oxford University Press.
    Natural languages are riddled with context-sensitivity. One and the same string of words can express many different meanings on occasion of use, and yet we understand one another effortlessly, on the fly. How do we do so? What fixes the meaning of context-sensitive expressions, and how are we able to recover the meaning so effortlessly? This book offers a novel response: we can do so because we draw on a broad array of subtle linguistic conventions that determine (...)
  40. Forgetting is learning-evaluation of 3 induction algorithms for learning artificial grammars.Rc Mathews, B. Druhan & L. Roussel - 1989 - Bulletin of the Psychonomic Society 27 (6):516-516.
  41. Nicolas Ruwet.in Generative Grammar - 1981 - In W. Klein & W. Levelt (eds.), Crossing the Boundaries in Linguistics. Reidel. pp. 23.
    No categories
     
    Export citation  
     
    Bookmark  
  42.  4
    Primary works.Rational Grammar - 2005 - In Siobhan Chapman & Christopher Routledge (eds.), Key thinkers in linguistics and the philosophy of language. Edinburgh: Edinburgh University Press. pp. 10.
    Direct download  
     
    Export citation  
     
    Bookmark  
  43.  13
    The Formal Theory of Grammar. [REVIEW]L. J. - 1975 - Review of Metaphysics 28 (3):557-558.
    Since a human language consists of an infinite number of sentences, it cannot be adequately described by enumeration. Hence, as Chomsky wrote in the first paragraph of his first book, Syntactic Structures, an adequate description of a language is approached through the specification of a generative device that will generate and structurally describe all the sentences of a language. And since generative devices form a hierarchy in terms of descriptive power, the basic question of grammar is what is the (...)
    Direct download  
     
    Export citation  
     
    Bookmark   1 citation  
  44.  28
    Compositionality in rational analysis: Grammar-based induction for concept learning.Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
    Direct download  
     
    Export citation  
     
    Bookmark   2 citations  
  45. Highly constrained unification grammars.Daniel Feinstein & Shuly Wintner - 2008 - Journal of Logic, Language and Information 17 (3):345-381.
    Unification grammars are widely accepted as an expressive means for describing the structure of natural languages. In general, the recognition problem is undecidable for unification grammars. Even with restricted variants of the formalism, off-line parsable grammars, the problem is computationally hard. We present two natural constraints on unification grammars which limit their expressivity and allow for efficient processing. We first show that non-reentrant unification grammars generate exactly the class of context-free languages. We then relax the constraint and show (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   1 citation  
    Compositionality in rational analysis: grammar-based induction for concept learning.Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
     
    Export citation  
     
    Bookmark  
  47.  48
    English Grammar Error Correction Algorithm Based on Classification Model.Shanchun Zhou & Wei Liu - 2021 - Complexity 2021:1-11.
    English grammar error correction algorithm refers to the use of computer programming technology to automatically recognize and correct the grammar errors contained in English text written by nonnative language learners. Classification model is the core of machine learning and data mining, which can be applied to extracting information from English text data and constructing a reliable grammar correction method. On the basis of summarizing and analyzing previous research works, this paper expounds the research status and significance (...)
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   2 citations  
  48.  19
    Context, Content, and the Occasional Costs of Implicature Computation.Raj Singh - 2019 - Frontiers in Psychology 10:456058.
    The computation of scalar implicatures is sometimes costly relative to basic meanings. Among the costly computations are those that involve strengthening `some' to `not all' and strengthening inclusive disjunction to exclusive disjunction. The opposite is true for some other cases of strengthening, where the strengthened meaning is less costly than its corresponding basic meaning. These include conjunctive strengthenings of disjunctive sentences (e.g., free-choice inferences) and exactly-readings of numerals. Assuming that these are indeed all instances of strengthening via implicature/exhaustification, the (...)
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   1 citation  
  49.  9
    A connectionist parser for context-free phrase structure grammars.Rolf Wilkens & Helmut Schnelle - 1990 - In G. Dorffner (ed.), Konnektionismus in Artificial Intelligence Und Kognitionsforschung. Berlin: Springer-Verlag. pp. 38--47.
  50.  18
    Commutative Lambek Grammars.Tikhon Pshenitsyn - 2023 - Journal of Logic, Language and Information 32 (5):887-936.
    Lambek categorial grammars is a class of formal grammars based on the Lambek calculus. Pentus proved in 1993 that they generate exactly the class of context-free languages without the empty word. In this paper, we study categorial grammars based on the Lambek calculus with the permutation rule LP. Of particular interest is the product-free fragment of LP called the Lambek-van Benthem calculus LBC. Buszkowski in his 1984 paper conjectured that grammars based on the Lambek-van Benthem calculus (LBC-grammars (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark  
1 — 50 / 1000