12 found
  1. Static and Dynamic Vector Semantics for Lambda Calculus Models of Natural Language. Mehrnoosh Sadrzadeh & Reinhard Muskens - 2018 - Journal of Language Modelling 6 (2):319-351.
    Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In the vector models, (...)
    2 citations
  2. Algebraic Semantics and Model Completeness for Intuitionistic Public Announcement Logic. Minghui Ma, Alessandra Palmigiano & Mehrnoosh Sadrzadeh - 2014 - Annals of Pure and Applied Logic 165 (4):963-995.
    In the present paper, we start studying epistemic updates using the standard toolkit of duality theory. We focus on public announcements, which are the simplest epistemic actions, and hence on Public Announcement Logic without the common knowledge operator. As is well known, the epistemic action of publicly announcing a given proposition is semantically represented as a transformation of the model encoding the current epistemic setup of the given agents; the given current model being replaced with its submodel relativized to the (...)
    7 citations
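    The model transformation this abstract describes can be sketched concretely. The following is an illustrative toy in Python (not the paper's duality-theoretic treatment): publicly announcing a proposition replaces the current Kripke model with its submodel relativized to the worlds where the announced proposition holds.

    ```python
    def announce(worlds, relation, proposition):
        """Relativize a Kripke model to the worlds where `proposition` holds.

        worlds:      set of world names
        relation:    dict agent -> set of (w, v) accessibility pairs
        proposition: set of worlds where the announced fact is true
        """
        kept = worlds & proposition
        new_relation = {
            agent: {(w, v) for (w, v) in pairs if w in kept and v in kept}
            for agent, pairs in relation.items()
        }
        return kept, new_relation

    # A two-world model: agent `a` cannot distinguish w1 from w2.
    worlds = {"w1", "w2"}
    relation = {"a": {("w1", "w1"), ("w1", "w2"), ("w2", "w1"), ("w2", "w2")}}

    # Publicly announcing a fact true only at w1 eliminates w2.
    kept, rel = announce(worlds, relation, {"w1"})
    print(kept)      # {'w1'}
    print(rel["a"])  # {('w1', 'w1')}
    ```

    After the announcement the agent's uncertainty between the two worlds disappears, which is exactly the "submodel relativized to the announced proposition" picture.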
  3. Context Update for Lambdas and Vectors. Reinhard Muskens & Mehrnoosh Sadrzadeh - 2016 - In Maxime Amblard, Philippe de Groote, Sylvain Pogodalla & Christian Retoré (eds.), Logical Aspects of Computational Linguistics. Celebrating 20 Years of LACL (1996--2016). Berlin Heidelberg: Springer. pp. 247--254.
    Vector models of language are based on the contextual aspects of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, the denotations of phrases, and their compositional properties. In the latter approach the denotation of a sentence determines its truth conditions and can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In this short paper, we develop a vector semantics for language based (...)
    2 citations
  4. Lambek Vs. Lambek: Functorial Vector Space Semantics and String Diagrams for Lambek Calculus. Bob Coecke, Edward Grefenstette & Mehrnoosh Sadrzadeh - 2013 - Annals of Pure and Applied Logic 164 (11):1079-1100.
    The Distributional Compositional Categorical model is a mathematical framework that provides compositional semantics for meanings of natural language sentences. It consists of a computational procedure for constructing meanings of sentences, given their grammatical structure in terms of compositional type-logic, and given the empirically derived meanings of their words. For the particular case that the meaning of words is modelled within a distributional vector space model, its experimental predictions, derived from real large scale data, have outperformed other empirically validated methods that (...)
    3 citations
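    The compositional procedure this abstract refers to can be sketched in a few lines. The following is a toy NumPy illustration (made-up dimensions and random vectors, not the paper's functorial construction): a transitive verb is a tensor of order 3 in N ⊗ S ⊗ N, and the sentence meaning is obtained by contracting it with the subject and object noun vectors, mirroring the type-logical reduction n (nʳ s nˡ) n → s.

    ```python
    import numpy as np

    noun_dim, sent_dim = 4, 2
    rng = np.random.default_rng(0)

    subject = rng.random(noun_dim)                      # vector in N
    obj = rng.random(noun_dim)                          # vector in N
    verb = rng.random((noun_dim, sent_dim, noun_dim))   # tensor in N (x) S (x) N

    # Contract the subject against the verb's first index and the
    # object against its last; what remains is a sentence-space vector.
    sentence = np.einsum("i,isj,j->s", subject, verb, obj)
    print(sentence.shape)  # (2,): a vector in the sentence space S
    ```

    In practice the word vectors are derived empirically from corpus co-occurrence data rather than drawn at random; only the contraction pattern is fixed by the grammar.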
  5. Positive Logic with Adjoint Modalities: Proof Theory, Semantics, and Reasoning About Information. Mehrnoosh Sadrzadeh - 2010 - Review of Symbolic Logic 3 (3):351-373.
    We consider a simple modal logic whose nonmodal part has conjunction and disjunction as connectives and whose modalities come in adjoint pairs, but are not in general closure operators. Despite absence of negation and implication, and of axioms corresponding to the characteristic axioms of _T_, _S4_, and _S5_, such logics are useful, as shown in previous work by Baltag, Coecke, and the first author, for encoding and reasoning about information and misinformation in multiagent systems. For the propositional-only fragment of such (...)
    2 citations
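    "Modalities in adjoint pairs" means the two operators form a Galois connection: f ⊣ g iff f(X) ≤ Y ⟺ X ≤ g(Y). A small executable check (an illustrative toy, not the paper's proof theory): on the powerset of a finite set, the direct image along a relation R is left adjoint to the box operator box(Y) = {x : every R-successor of x lies in Y}.

    ```python
    from itertools import chain, combinations

    universe = {0, 1, 2}
    R = {(0, 1), (1, 2), (2, 2)}  # a toy accessibility relation

    def diamond(X):  # direct image: worlds reachable from X
        return {v for (u, v) in R if u in X}

    def box(Y):      # worlds all of whose successors lie in Y
        return {u for u in universe if all(v in Y for (w, v) in R if w == u)}

    def subsets(s):
        s = list(s)
        return [set(c) for c in
                chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

    # Verify the adjunction diamond -| box on every pair of subsets.
    ok = all((diamond(X) <= Y) == (X <= box(Y))
             for X in subsets(universe) for Y in subsets(universe))
    print(ok)  # True
    ```

    Note that only meets and joins of the powerset lattice are used here, matching the abstract's point that negation and implication are not needed.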
  6. Ockham’s Razor and Reasoning About Information Flow. Mehrnoosh Sadrzadeh - 2009 - Synthese 167 (2):391-408.
    What is the minimal algebraic structure to reason about information flow? Do we really need the full power of Boolean algebras with co-closure and de Morgan dual operators? How much can we weaken and still be able to reason about multi-agent scenarios in a tidy compositional way? This paper provides some answers.
    3 citations
  8. Fuzzy Generalised Quantifiers for Natural Language in Categorical Compositional Distributional Semantics. Matěj Dostál, Mehrnoosh Sadrzadeh & Gijs Wijnholds - 2021 - In Mojtaba Mojtahedi, Shahid Rahman & MohammadSaleh Zarepour (eds.), Mathematics, Logic, and Their Philosophies: Essays in Honour of Mohammad Ardeshir. Springer. pp. 135-160.
    Recent work on compositional distributional models shows that bialgebras over finite dimensional vector spaces can be applied to treat generalised quantifiers for natural language. That technique requires one to construct the vector space over powersets, and therefore is computationally costly. In this paper, we overcome this problem by considering fuzzy versions of quantifiers along the lines of Zadeh, within the category of many valued relations. We show that this category is a concrete instantiation of the (...)
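    The Zadeh-style idea behind this abstract can be sketched in a toy example (illustrative only, not the paper's categorical construction; the membership function for "most" is an assumed shape): a fuzzy quantifier maps the proportion of A-elements that are also B-elements to a degree in [0, 1], without ever building a space over powersets.

    ```python
    def proportion(A, B):
        """Degree to which fuzzy set A (dict elem -> degree) is included in B."""
        overlap = sum(min(A[x], B.get(x, 0.0)) for x in A)
        total = sum(A.values())
        return overlap / total if total else 0.0

    def most(p):
        """A piecewise-linear membership function for 'most' (an assumed shape)."""
        return max(0.0, min(1.0, 2.0 * p - 0.6))

    dogs = {"rex": 1.0, "fido": 1.0, "spot": 1.0, "lassie": 1.0}
    bark = {"rex": 1.0, "fido": 0.9, "spot": 0.8, "lassie": 0.1}

    p = proportion(dogs, bark)   # 0.7
    print(round(most(p), 2))     # degree of "most dogs bark": 0.8
    ```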
  9. Pregroup Grammars, Their Syntax and Semantics. Mehrnoosh Sadrzadeh - 2021 - In Claudia Casadio & Philip J. Scott (eds.), Joachim Lambek: The Interplay of Mathematics, Logic, and Linguistics. Springer Verlag. pp. 347-376.
    Pregroup grammars were developed in 1999 and remained Lambek’s preferred algebraic model of grammar. The set-theoretic semantics of pregroups, however, faces an ambiguity problem. In his latest book, Lambek suggests that this problem might be overcome using finite dimensional vector spaces rather than sets. What is the right notion of composition in this setting, direct sum or tensor product of spaces?
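    The syntactic side of pregroup grammars can be illustrated with a small sketch (a hypothetical helper, not the full formalism): parsing cancels adjacent adjoint pairs, x · xʳ → 1 and xˡ · x → 1, and a word string is grammatical when its types reduce to the single sentence type "s". A stack suffices for the simple left-to-right reductions shown here, though general pregroup reduction is more subtle.

    ```python
    def reduce_types(types):
        """Repeatedly cancel adjacent adjoint pairs in a list of simple types."""
        stack = []
        for t in types:
            if stack and (stack[-1] == t + "^l" or t == stack[-1] + "^r"):
                stack.pop()          # x^l . x -> 1   or   x . x^r -> 1
            else:
                stack.append(t)
        return stack

    # "John sleeps": n . (n^r s)  ->  s
    print(reduce_types(["n", "n^r", "s"]))              # ['s']
    # "John sees Mary": n . (n^r s n^l) . n  ->  s
    print(reduce_types(["n", "n^r", "s", "n^l", "n"]))  # ['s']
    ```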
  10. A Type-Driven Vector Semantics for Ellipsis with Anaphora Using Lambek Calculus with Limited Contraction. Gijs Wijnholds & Mehrnoosh Sadrzadeh - 2019 - Journal of Logic, Language and Information 28 (2):331-358.
    We develop a vector space semantics for verb phrase ellipsis with anaphora using type-driven compositional distributional semantics based on the Lambek calculus with limited contraction of Jäger. Distributional semantics has a lot to say about the statistical collocation based meanings of content words, but provides little guidance on how to treat function words. Formal semantics on the other hand, has powerful mechanisms for dealing with relative pronouns, coordinators, and the like. Type-driven compositional distributional semantics brings these two models together. We (...)
  11. Semantic Vector Models and Functional Models for Pregroup Grammars. Anne Preller & Mehrnoosh Sadrzadeh - 2011 - Journal of Logic, Language and Information 20 (4):419-443.
    We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice-versa. The semantics is compositional, variable free and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only if the ‘usual’ first order (...)
  12. Incremental Composition in Distributional Semantics. Matthew Purver, Mehrnoosh Sadrzadeh, Ruth Kempson, Gijs Wijnholds & Julian Hough - 2021 - Journal of Logic, Language and Information 30 (2):379-406.
    Despite the incremental nature of Dynamic Syntax, the semantic grounding of it remains that of predicate logic, itself grounded in set theory, so is poorly suited to expressing the rampantly context-relative nature of word meaning, and related phenomena such as incremental judgements of similarity needed for the modelling of disambiguation. Here, we show how DS can be assigned a compositional distributional semantics which enables such judgements and makes it possible to incrementally disambiguate language constructs using vector space semantics. Building on (...)
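    The incremental similarity judgements this abstract mentions can be sketched with a toy example (made-up vectors and sense names, not the paper's Dynamic Syntax machinery): the sense whose vector is closest, by cosine, to the context vector built so far is preferred, and the judgement is re-evaluated as each new word arrives.

    ```python
    import math

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv) if nu and nv else 0.0

    # Two hypothetical senses of "bank", and a context vector built
    # incrementally as the running sum of the word vectors seen so far.
    senses = {"bank/finance": [0.9, 0.1, 0.0], "bank/river": [0.0, 0.2, 0.9]}
    context_words = {"money": [0.8, 0.2, 0.0], "deposit": [0.7, 0.3, 0.1]}

    context = [0.0, 0.0, 0.0]
    for vec in context_words.values():
        context = [a + b for a, b in zip(context, vec)]  # incremental update
        best = max(senses, key=lambda s: cosine(senses[s], context))
        print(best)  # "bank/finance" after each update
    ```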