Rock classification can enhance fracture treatment design for successful field development in organic-shale reservoirs. The petrophysical and elastic properties of a formation are important to consider when selecting the best candidate zones for fracture treatment. Rock classification techniques based on well logs can be advantageous compared with conventional core-based techniques, and they enable depth-by-depth formation characterization. We developed and evaluated three rock classification techniques for organic-shale formations that incorporate well logs and well-log-based estimates of elastic properties, petrophysical properties, mineralogy, and organic richness. The three techniques are (1) a 3D crossplot analysis of organic richness, volumetric concentrations of minerals, and rock brittleness index; (2) an unsupervised artificial neural network (ANN) built from well logs; and (3) an unsupervised ANN built from well-log-based estimates of petrophysical, compositional, and elastic properties. A self-consistent approximation rock-physics model is used to estimate elastic rock properties; it enables assessment of elastic properties from well-log-derived estimates of mineralogy and the shapes of rock components, in the absence of acoustic-wave velocity logs. Finally, we applied the three proposed techniques to rock classification in the Haynesville Shale and verified the identified rock types against thin-section images and previously identified lithofacies. We determined that well logs can be used directly for rock classification instead of the petrophysical, compositional, and elastic properties obtained from well-log interpretation. Direct use of well logs, instead of well-log-derived properties, reduces the uncertainty associated with the physical models used to estimate elastic moduli and petrophysical/compositional properties. The three proposed well-log-based rock classification techniques can potentially enhance fracture treatment in complex organic-shale reservoirs by detecting the best candidate zones and optimizing the number of required fracture stages.
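To make the workflow concrete, the following is a minimal, illustrative sketch of well-log-based rock classification: it computes a simple mineralogical brittleness index and clusters depth samples with k-means as a stand-in for the unsupervised ANN described above. The input file, column names, brittleness definition, and number of classes are all assumptions, not the authors' actual implementation.

```python
# Illustrative sketch only (assumed file and column names; k-means stands in
# for the unsupervised ANN): cluster depth samples into rock classes from
# well-log-derived compositional and organic-richness estimates.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

logs = pd.read_csv("well_log_estimates.csv")   # hypothetical per-depth estimates

# One common mineralogy-based brittleness index (other definitions exist):
# quartz fraction relative to quartz + carbonate + clay.
logs["BI"] = logs["V_quartz"] / (logs["V_quartz"] + logs["V_carbonate"] + logs["V_clay"])

features = logs[["TOC", "V_quartz", "V_carbonate", "V_clay", "BI"]]
X = StandardScaler().fit_transform(features)

# Assign each depth sample to one of, say, four rock classes.
logs["rock_class"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(logs.groupby("rock_class")[["TOC", "BI"]].mean())
```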
Carbonate formations consist of a wide range of pore types with different shapes, pore-throat sizes, and varying levels of pore-network connectivity. Such heterogeneous pore-network properties affect fluid flow in the formation. However, characterizing pore-network properties in carbonate formations is challenging because of heterogeneity at different scales and the complex pore structure of carbonate rocks. We developed an integrated technique for multiscale characterization of carbonate pore structure based on mercury injection capillary pressure (MICP) measurements, X-ray micro-computed tomography (micro-CT) 3D rock images, and well logs. We determined pore types based on the pore-throat radius distributions obtained from MICP measurements. We then developed a new method for improved assessment of effective porosity and permeability in the well-log domain using pore-scale numerical simulations of fluid flow and electric current flow in 3D micro-CT core images obtained for each pore type. Finally, we conducted petrophysical rock classification based on the depth-by-depth estimates of effective porosity, permeability, volumetric concentrations of minerals, and pore types using an unsupervised artificial neural network. We successfully applied the proposed technique to three wells in the Scurry Area Canyon Reef Operators Committee Unit. Our results show that electrical resistivity measurements can be used for reliable characterization of pore structure and assessment of effective porosity and permeability in carbonate formations. The estimates of permeability in the well-log domain were cross-validated against the available core measurements. We observed a 34% reduction in the relative error of well-log-based permeability estimates compared with core-based porosity-permeability models.
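As one concrete piece of such a workflow, here is a hedged sketch of deriving pore-throat radii from MICP capillary pressures via the Washburn equation and labelling a sample by its dominant pore-throat radius. The interfacial-tension and contact-angle values are commonly used mercury-air defaults, and the input file, column names, and pore-type thresholds are assumptions for illustration only.

```python
# Illustrative sketch (assumed file, columns, and thresholds): pore-throat
# radius from MICP capillary pressure via the Washburn equation, then a simple
# pore-type label from the dominant (modal) pore-throat radius.
import numpy as np
import pandas as pd

def pore_throat_radius_um(pc_psi, sigma=0.480, theta_deg=140.0):
    """Washburn equation: radius in microns from Pc in psi.
    sigma (N/m) and theta (degrees) are commonly used Hg-air values."""
    pc_pa = np.asarray(pc_psi) * 6894.76          # psi -> Pa
    r_m = 2.0 * sigma * abs(np.cos(np.radians(theta_deg))) / pc_pa
    return r_m * 1e6

micp = pd.read_csv("micp_sample.csv")             # hypothetical: Pc_psi, dHg_incremental
micp["r_um"] = pore_throat_radius_um(micp["Pc_psi"])

# Dominant pore-throat radius = radius at the peak of incremental intrusion.
r_dom = micp.loc[micp["dHg_incremental"].idxmax(), "r_um"]

# Assumed thresholds (microns) separating pore types, for illustration only.
pore_type = ("macropore-dominated" if r_dom > 2.0
             else "mesopore-dominated" if r_dom > 0.1
             else "micropore-dominated")
print(pore_type, round(float(r_dom), 3))
```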
Vector models of language are based on the contextual aspects of language: the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language: the compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In vector models, the degree of co-occurrence of words in context determines how similar the meanings of words are. In this paper, we put these two models together and develop a vector semantics for language based on the simply typed lambda calculus models of natural language. We provide two types of vector semantics: a static one that uses techniques familiar from the truth conditional tradition, and a dynamic one based on a form of dynamic interpretation inspired by Heim’s context change potentials. We show how the dynamic model can be applied to entailment between a corpus and a sentence, and we provide examples.
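To illustrate the general idea (not the paper's formal system), the toy sketch below treats word meanings as typed lambda terms over vectors: a noun is a vector, an intransitive verb is a term of function type from the noun space to the sentence space, and a sentence meaning arises by application. Dimensions and values are invented.

```python
# Toy sketch: word meanings as typed lambda terms over vectors (invented data).
import numpy as np

rng = np.random.default_rng(0)
dim_n, dim_s = 4, 2                           # toy noun- and sentence-space dimensions

dogs = rng.random(dim_n)                      # noun meaning: a vector of type n
M_sleep = rng.random((dim_s, dim_n))          # toy matrix interpreting "sleep"
sleep = lambda noun_vec: M_sleep @ noun_vec   # verb meaning: a term of type n -> s

sentence = sleep(dogs)                        # "dogs sleep" = application of the term
print(sentence)                               # a vector in the sentence space
```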
Vector models of language are based on the contextual aspects of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, the denotations of phrases, and their compositional properties. In the latter approach, the denotation of a sentence determines its truth conditions and can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In this short paper, we develop a vector semantics for language based on the simply typed lambda calculus. Our semantics uses techniques familiar from the truth conditional tradition and is based on a form of dynamic interpretation inspired by Heim's context updates.
We consider a simple modal logic whose nonmodal part has conjunction and disjunction as connectives and whose modalities come in adjoint pairs, but are not in general closure operators. Despite the absence of negation and implication, and of axioms corresponding to the characteristic axioms of _T_, _S4_, and _S5_, such logics are useful, as shown in previous work by Baltag, Coecke, and the first author, for encoding and reasoning about information and misinformation in multiagent systems. For the propositional-only fragment of such a dynamic epistemic logic, we present an algebraic semantics, using lattices with agent-indexed families of adjoint pairs of operators, and a cut-free sequent calculus. The calculus exploits operators on sequents, in the style of “nested” or “tree-sequent” calculi; cut-admissibility is shown by constructive syntactic methods. The applicability of the logic is illustrated by reasoning about the muddy children puzzle, for which the calculus is augmented with extra rules to express the facts of the muddy children scenario.
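For readers unfamiliar with adjoint pairs of modalities, the following is a standard statement of the adjunction on a lattice, written in illustrative notation rather than the paper's own:

```latex
% Standard adjunction (residuation) condition for an agent-indexed pair of
% modal operators on a lattice L; notation is illustrative.
\[
  \Diamond_A \dashv \Box_A
  \quad\text{i.e.}\quad
  \Diamond_A m \le n \;\Longleftrightarrow\; m \le \Box_A n
  \qquad \text{for all } m, n \in L,
\]
\[
  \text{so } \Diamond_A \text{ preserves all existing joins and } \Box_A \text{ preserves all existing meets.}
\]
```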
What is the minimal algebraic structure to reason about information flow? Do we really need the full power of Boolean algebras with co-closure and de Morgan dual operators? How much can we weaken and still be able to reason about multi-agent scenarios in a tidy compositional way? This paper provides some answers.
Pregroup grammars were developed in 1999 and remained Lambek’s preferred algebraic model of grammar. The set-theoretic semantics of pregroups, however, faces an ambiguity problem. In his latest book, Lambek suggests that this problem might be overcome using finite dimensional vector spaces rather than sets. What is the right notion of composition in this setting: the direct sum or the tensor product of spaces?
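As a reminder of how pregroup composition works on the type side (independently of the vector-space question raised above), a standard reduction for a transitive sentence such as "John likes Mary" runs as follows; the types shown are the usual textbook ones, not taken from this paper:

```latex
% Standard pregroup reduction for a transitive sentence (illustrative types):
% the noun type n, the verb type n^r s n^l, and the contractions
% n\,n^r \le 1 and n^l\,n \le 1 yield the sentence type s.
\[
  n \,(n^r\, s\, n^l)\, n \;\le\; 1 \cdot s \cdot 1 \;=\; s.
\]
```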
We develop a vector space semantics for verb phrase ellipsis with anaphora using type-driven compositional distributional semantics based on the Lambek calculus with limited contraction (LCC) of Jäger. Distributional semantics has a lot to say about the statistical, collocation-based meanings of content words, but provides little guidance on how to treat function words. Formal semantics, on the other hand, has powerful mechanisms for dealing with relative pronouns, coordinators, and the like. Type-driven compositional distributional semantics brings these two models together. We review previous compositional distributional models of relative pronouns, coordination, and a restricted account of ellipsis in the DisCoCat framework of Coecke et al. (2013). We show how DisCoCat cannot deal with general forms of ellipsis, which rely on copying of information, and develop a novel way of connecting type-logical grammar to distributional semantics by assigning vector-interpretable lambda terms to derivations of LCC, in the style of Muskens and Sadrzadeh (Logical Aspects of Computational Linguistics, Springer, Berlin, 2016). What follows is an account of ellipsis in which word meanings can be copied: the meaning of a sentence is now a program with non-linear access to individual word embeddings. We present the theoretical setting, work out examples, and demonstrate our results with a state-of-the-art distributional model on an extended verb disambiguation dataset.
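The following toy sketch illustrates the kind of copying involved, without claiming to reproduce the paper's LCC lambda terms: the verb-phrase meaning is built once and then reused, non-linearly, for both the antecedent clause and the elided clause. All vectors and tensors are random stand-ins.

```python
# Toy sketch of copying a verb-phrase meaning for ellipsis (invented data;
# not the paper's LCC lambda terms).
import numpy as np

rng = np.random.default_rng(1)
dn, ds = 4, 2                                        # toy noun/sentence dimensions
john, mary, guitar = (rng.random(dn) for _ in range(3))
plays = rng.random((dn, ds, dn))                     # verb tensor: subject x sentence x object

vp = np.tensordot(plays, guitar, axes=([2], [0]))    # "plays guitar": a (dn, ds) map
s1 = john @ vp                                       # "John plays the guitar"
s2 = mary @ vp                                       # "... and Mary does too": copied VP meaning
print(s1, s2)
```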
Recent work on compositional distributional models shows that bialgebras over finite dimensional vector spaces can be applied to treat generalised quantifiers for natural language. That technique requires one to construct the vector space over powersets, and is therefore computationally costly. In this paper, we overcome this problem by considering fuzzy versions of quantifiers along the lines of Zadeh, within the category of many valued relations. We show that this category is a concrete instantiation of the compositional distributional model. We show that the semantics obtained in this model is equivalent to the semantics of Zadeh's fuzzy quantifiers. As a result, we are now able to treat fuzzy quantification without requiring a powerset construction.
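A minimal sketch of the Zadeh-style evaluation being referred to, using relative sigma-counts over a finite universe; the membership function chosen for "most" and the data are illustrative assumptions:

```python
# Sketch of Zadeh-style fuzzy quantification via relative sigma-counts
# (illustrative membership function and data).
import numpy as np

def truth_of_quantified(q_membership, A, B):
    """Degree of truth of 'Q A are B' for fuzzy sets A, B given as arrays of
    membership degrees over the same finite universe."""
    proportion = np.minimum(A, B).sum() / A.sum()
    return float(q_membership(proportion))

most = lambda p: np.clip(2.0 * p - 1.0, 0.0, 1.0)    # assumed membership for "most"

A = np.array([1.0, 0.8, 0.6, 0.9])                   # e.g. degrees of being a "dog"
B = np.array([0.9, 0.7, 0.2, 0.8])                   # degrees of being "friendly"
print(truth_of_quantified(most, A, B))
```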
We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice versa. The semantics is compositional, variable free, and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only if the ‘usual’ first order formula translating the sentence holds. The examples include negation, universal quantifiers, and relative pronouns.
In the present paper, we start studying epistemic updates using the standard toolkit of duality theory. We focus on public announcements, which are the simplest epistemic actions, and hence on Public Announcement Logic (PAL) without the common knowledge operator. As is well known, the epistemic action of publicly announcing a given proposition is semantically represented as a transformation of the model encoding the current epistemic setup of the given agents, the current model being replaced with its submodel relativized to the announced proposition. We dually characterize the associated submodel-injection map as a certain pseudo-quotient map between the complex algebras respectively associated with the given model and with its relativized submodel. As is well known, these complex algebras are complete atomic Boolean algebras with operators (BAOs). The dual characterization we provide naturally generalizes to much wider classes of algebras, which include, but are not limited to, arbitrary BAOs and arbitrary modal expansions of Heyting algebras. Thanks to this construction, the benefits and the wider scope of applications given by a point-free, intuitionistic theory of epistemic updates are made available. As an application of this dual characterization, we axiomatize the intuitionistic analogue of PAL, which we refer to as IPAL, prove soundness and completeness of IPAL w.r.t. both algebraic and relational models, and show that the well-known Muddy Children Puzzle can be formalized in IPAL.
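For orientation, the following is a hedged sketch of the pseudo-quotient construction referred to above, in illustrative notation: given an algebra and an element interpreting the announced proposition, one quotients by an equivalence identifying elements that agree below that element.

```latex
% Hedged sketch of the pseudo-quotient (illustrative notation). For an algebra
% $\mathbb{A}$ and $a \in \mathbb{A}$ interpreting the announced proposition:
\[
  b \equiv_a c \;\iff\; b \wedge a = c \wedge a,
  \qquad
  \mathbb{A}^a \;:=\; \mathbb{A}/{\equiv_a},
\]
% with the quotient map $\pi_a : \mathbb{A} \to \mathbb{A}^a$ dually
% characterizing the injection of the relativized submodel.
```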
Despite the incremental nature of Dynamic Syntax (DS), its semantic grounding remains that of predicate logic, itself grounded in set theory, and so it is poorly suited to expressing the rampantly context-relative nature of word meaning and related phenomena, such as the incremental judgements of similarity needed for modelling disambiguation. Here, we show how DS can be assigned a compositional distributional semantics that enables such judgements and makes it possible to incrementally disambiguate language constructs using vector space semantics. Building on a proposal in our previous work, we implement and evaluate our model on real data, showing that it outperforms a commonly used additive baseline. In conclusion, we argue that these results set the ground for an account of the non-determinism of lexical content, in which the nature of word meaning is its dependence on the surrounding context for its construal.
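For concreteness, the sketch below shows the shape of the disambiguation comparison described: a composed phrase representation is scored against candidate sense vectors by cosine similarity, with plain vector addition as the baseline composition. The vectors are random toys, not the paper's trained embeddings.

```python
# Sketch of cosine-based sense selection with an additive-composition baseline
# (random toy vectors, not the paper's model).
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(2)
d = 8
subject, ambiguous_verb = rng.random(d), rng.random(d)
senses = {"sense_a": rng.random(d), "sense_b": rng.random(d)}

phrase = subject + ambiguous_verb                    # additive baseline composition
best = max(senses, key=lambda s: cosine(phrase, senses[s]))
print("additive baseline picks:", best)
```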
The Distributional Compositional Categorical (DisCoCat) model is a mathematical framework that provides compositional semantics for meanings of natural language sentences. It consists of a computational procedure for constructing the meanings of sentences, given their grammatical structure in terms of a compositional type-logic and the empirically derived meanings of their words. For the particular case in which the meanings of words are modelled within a distributional vector space model, its experimental predictions, derived from real large-scale data, have outperformed other empirically validated methods for building vectors for a full sentence. This success can be attributed to a conceptually motivated mathematical underpinning, which the other methods lack: the integration of qualitative compositional type-logic and quantitative modelling of meaning within a category-theoretic framework. The type-logic used in the DisCoCat model is Lambek’s pregroup grammar. Pregroup types form a posetal compact closed category, whose structure can be passed, in a functorial manner, on to the compact closed structure of vector spaces, linear maps, and the tensor product. The diagrammatic versions of the equational reasoning in compact closed categories can be interpreted as the flow of word meanings within sentences. Pregroups simplify Lambek’s previous type-logic, the Lambek calculus. The latter and its extensions have been extensively used to formalise and reason about various linguistic phenomena. Hence, the apparent reliance of the DisCoCat on pregroups has been seen as a shortcoming. This paper addresses this concern by pointing out that one may as well realise a functorial passage from the original type-logic of Lambek, a monoidal bi-closed category, to vector spaces, or to any other model of meaning organised within a monoidal bi-closed category. The corresponding string diagram calculus, due to Baez and Stay, now depicts the flow of word meanings, and also reflects the structure of the parse trees of the Lambek calculus.
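As a toy illustration of the functorial passage described above, the pregroup reduction n (n^r s n^l) n -> s becomes tensor contraction once noun and sentence types are sent to vector spaces; the dimensions and word tensors below are invented.

```python
# Toy DisCoCat-style composition: the pregroup reduction becomes tensor
# contraction in the category of vector spaces (invented data).
import numpy as np

rng = np.random.default_rng(3)
dn, ds = 4, 2                          # toy noun- and sentence-space dimensions
alice, bob = rng.random(dn), rng.random(dn)
likes = rng.random((dn, ds, dn))       # transitive verb in N (x) S (x) N

# Contract the subject with the first N factor and the object with the last.
sentence = np.einsum("i,isj,j->s", alice, likes, bob)
print(sentence)                        # a vector in the sentence space S
```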