Results for 'Vector model'

994 found
  1. Semantic Vector Models and Functional Models for Pregroup Grammars.Anne Preller & Mehrnoosh Sadrzadeh - 2011 - Journal of Logic, Language and Information 20 (4):419-443.
    We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice-versa. The semantics is compositional, variable free and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only if the (...)
  2. A vector model for psychophysical judgment.John Ross & Vincent di Lollo - 1968 - Journal of Experimental Psychology 77 (3p2):1.
  3. Multidimensional vector model of stimulus–response compatibility.Motonori Yamaguchi & Robert W. Proctor - 2012 - Psychological Review 119 (2):272-303.
  4. Expanding the vector model for dispositionalist approaches to causation.Joseph A. Baltimore - 2019 - Synthese 196 (12):5083-5098.
    Neuron diagrams are heavily employed in academic discussions of causation. Stephen Mumford and Rani Lill Anjum, however, offer an alternative approach employing vector diagrams, which this paper attempts to develop further. I identify three ways in which dispositionalists have taken the activities of powers to be related: stimulation, mutual manifestation, and contribution combination. While Mumford and Anjum do provide resources for representing contribution combination, which might be sufficient for their particular brand of dispositionalism, I argue that those resources are (...)
  5. Model-theory of vector-spaces over unspecified fields.David Pierce - 2009 - Archive for Mathematical Logic 48 (5):421-436.
    Vector spaces over unspecified fields can be axiomatized as one-sorted structures, namely, abelian groups with the relation of parallelism. Parallelism is binary linear dependence. When equipped with the n-ary relation of linear dependence for some positive integer n, a vector-space is existentially closed if and only if it is n-dimensional over an algebraically closed field. In the signature with an n-ary predicate for linear dependence for each positive integer n, the theory of infinite-dimensional vector spaces over algebraically (...)
  6. Static and dynamic vector semantics for lambda calculus models of natural language.Mehrnoosh Sadrzadeh & Reinhard Muskens - 2018 - Journal of Language Modelling 6 (2):319-351.
    Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In the (...) models, the degree of co-occurrence of words in context determines how similar the meanings of words are. In this paper, we put these two models together and develop a vector semantics for language based on the simply typed lambda calculus models of natural language. We provide two types of vector semantics: a static one that uses techniques familiar from the truth conditional tradition and a dynamic one based on a form of dynamic interpretation inspired by Heim’s context change potentials. We show how the dynamic model can be applied to entailment between a corpus and a sentence and provide examples.
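    The contrast this entry draws between contextual (distributional) models and truth-conditional models is easy to illustrate with a toy distributional model. The sketch below is only a rough illustration, not the paper's construction: it builds word vectors from co-occurrence counts over a made-up four-sentence corpus and compares them by cosine similarity; the corpus, window size and vocabulary are assumptions introduced here.

      # Toy distributional model: word vectors from co-occurrence counts.
      # The corpus and window size are illustrative assumptions.
      import numpy as np

      corpus = ["dogs chase cats", "cats chase mice", "dogs eat food", "cats eat food"]
      tokens = [s.split() for s in corpus]
      vocab = sorted({w for sent in tokens for w in sent})
      index = {w: i for i, w in enumerate(vocab)}

      # Symmetric co-occurrence counts within a +/-1 word window.
      counts = np.zeros((len(vocab), len(vocab)))
      for sent in tokens:
          for i, w in enumerate(sent):
              for j in range(max(0, i - 1), min(len(sent), i + 2)):
                  if i != j:
                      counts[index[w], index[sent[j]]] += 1

      def cosine(u, v):
          return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

      # Words used in similar contexts end up with similar vectors.
      print(cosine(counts[index["dogs"]], counts[index["cats"]]))   # relatively high
      print(cosine(counts[index["dogs"]], counts[index["food"]]))   # lower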
  7. Vector space models of lexical meaning.Stephen Clark - 2015 - In Shalom Lappin & Chris Fox (eds.), Handbook of Contemporary Semantic Theory. Wiley-Blackwell.
     
  8. Vector subtraction implemented neurally: A neurocomputational model of some sequential cognitive and conscious processes.John Bickle, Cindy Worley & Marica Bernstein - 2000 - Consciousness and Cognition 9 (1):117-144.
    Although great progress in neuroanatomy and physiology has occurred lately, we still cannot go directly to those levels to discover the neural mechanisms of higher cognition and consciousness. But we can use neurocomputational methods based on these details to push this project forward. Here we describe vector subtraction as an operation that computes sequential paths through high-dimensional vector spaces. Vector-space interpretations of network activity patterns are a fruitful resource in recent computational neuroscience. Vector subtraction also appears (...)
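    The core operation described above, vector subtraction carrying a system from one state in a high-dimensional space to the next, can be sketched in a few lines. The state vectors below are arbitrary stand-ins for activation patterns and are not taken from the paper's model.

      # Sketch: stepping through a sequence of states by adding stored
      # difference vectors. The state vectors are arbitrary stand-ins.
      import numpy as np

      states = {
          "A": np.array([1.0, 0.0, 0.0]),
          "B": np.array([0.0, 1.0, 0.0]),
          "C": np.array([0.0, 0.0, 1.0]),
      }

      # Each transition is the subtraction of the current state from the next.
      step_ab = states["B"] - states["A"]
      step_bc = states["C"] - states["B"]

      current = states["A"]
      for step in (step_ab, step_bc):
          current = current + step              # apply the stored transition
      print(np.allclose(current, states["C"]))  # True: A -> B -> C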
  9. Vector space semantics: A model-theoretic analysis of locative prepositions. [REVIEW]Joost Zwarts & Yoad Winter - 2000 - Journal of Logic, Language and Information 9 (2):169-211.
    This paper introduces a compositional semantics of locative prepositional phrases which is based on a vector space ontology. Model-theoretic properties of prepositions like monotonicity and conservativity are defined in this system in a straightforward way. These notions are shown to describe central inferences with spatial expressions and to account for the grammaticality of preposition modification. Model-theoretic constraints on the set of possible prepositions in natural language are specified, similar to the semantic universals of Generalized Quantifier Theory.
  10. Spoils to the Vector - How to model causes if you are a realist about powers.Stephen Mumford & Rani Lill Anjum - 2011 - The Monist 94 (1):54-80.
    A standard way of representing causation is with neuron diagrams. This has become popular since the influential work of David Lewis. But it should not be assumed that such representations are metaphysically neutral and amenable to any theory of causation. On the contrary, this way of representing causation already makes several Humean assumptions about what causation is, and which suit Lewis’s programme of Humean Supervenience. An alternative of a vector diagram is better suited for a powers ontology. Causation should (...)
  11. Vector Autoregressive Hierarchical Hidden Markov Models for Extracting Finger Movements Using Multichannel Surface EMG Signals.Nebojša Malešević, Dimitrije Marković, Gunter Kanitz, Marco Controzzi, Christian Cipriani & Christian Antfolk - 2018 - Complexity 2018:1-12.
  12. Killing vectors in cosmological models with rotation.Thoralf Chrobok - 2000 - In M. Scherfner, T. Chrobok & M. Shefaat (eds.), Colloquium on Cosmic Rotation. Wissenschaft Und Technik Verlag. pp. 1--105.
  13. Reasoning with vectors: A continuous model for fast robust inference.Dominic Widdows & Trevor Cohen - 2015 - Logic Journal of the IGPL 23 (2):141-173.
    This article describes the use of continuous vector space models for reasoning with a formal knowledge base. The practical significance of these models is that they support fast, approximate but robust inference and hypothesis generation, which is complementary to the slow, exact, but sometimes brittle behaviour of more traditional deduction engines such as theorem provers. The article explains the way logical connectives can be used in semantic vector models, and summarizes the development of Predication-based Semantic Indexing, which involves the (...)
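    One widely used way of handling a logical connective in semantic vector models, consistent with the line of work summarized above though not copied from the article, treats negation as projection onto the orthogonal complement: "a NOT b" keeps the part of a that is unrelated to b. A minimal sketch with made-up vectors:

      # Sketch: negation in a semantic vector space as orthogonal projection,
      # i.e. "a NOT b" keeps the part of a that is unrelated to b.
      import numpy as np

      def negate(a, b):
          """Project a onto the orthogonal complement of b."""
          b_hat = b / np.linalg.norm(b)
          return a - (a @ b_hat) * b_hat

      a = np.array([0.9, 0.4, 0.1])   # made-up vector for an ambiguous term
      b = np.array([1.0, 0.0, 0.0])   # made-up vector for the sense to remove

      result = negate(a, b)
      print(result)                   # component along b is removed
      print(float(result @ b))        # ~0.0: orthogonal to the negated direction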
  14. Site Characterization Model Using Support Vector Machine and Ordinary Kriging.Sarat Das & Pijush Samui - 2011 - Journal of Intelligent Systems 20 (3):261-278.
    In the present study, ordinary kriging and support vector machine (SVM) methods have been used to develop a three-dimensional site characterization model of an alluvial site based on standard penetration test (SPT) results. The SVM, a novel type of learning machine based on statistical learning theory, is applied here as a regression technique with an ε-insensitive loss function. The knowledge of the semivariogram of the SPT values is used in the ordinary kriging method to predict the N values at any point (...)
  15. Vector spaces with a union of independent subspaces.Alessandro Berarducci, Marcello Mamino & Rosario Mennuni - 2024 - Archive for Mathematical Logic 63 (3):499-507.
    We study the theory of K-vector spaces with a predicate for the union X of an infinite family of independent subspaces. We show that if K is infinite then the theory is complete and admits quantifier elimination in the language of K-vector spaces with predicates for the n-fold sums of X with itself. If K is finite this is no longer true, but we still have that a natural completion is near-model-complete.
  16. Application of Bayesian Vector Autoregressive Model in Regional Economic Forecast.Jinghao Ma, Yujie Shang & Hongyan Zhang - 2021 - Complexity 2021:1-10.
    The Bayesian vector autoregressive (BVAR) model introduces the statistical properties of the variables, in the form of a prior distribution over the parameters, into the traditional vector autoregressive model, which helps overcome the problem of too few degrees of freedom. The BVAR model established in this paper can cope with short time series by exploiting this prior statistical information. In theory, it should therefore perform well in China’s regional economic forecasting. Most of the regional forecasting model literature lacks out-of-sample forecasting (...)
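    A full BVAR with a Minnesota-style prior is more involved than the abstract suggests, but the basic mechanism, prior information shrinking the coefficients of a short-sample VAR, can be sketched simply: under a zero-mean Gaussian prior, the posterior-mean coefficients reduce to a ridge-regularised least-squares fit. Everything below (data, lag order, shrinkage strength) is an illustrative assumption, not the paper's specification.

      # Sketch: VAR(1) with a zero-mean Gaussian prior on the coefficients,
      # whose posterior mean is a ridge-regularised least-squares estimate.
      # Data, lag order and shrinkage strength lam are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      T, k = 60, 3                              # short sample, 3 variables
      A_true = np.array([[0.5, 0.1, 0.0],
                         [0.0, 0.4, 0.2],
                         [0.1, 0.0, 0.3]])
      Y = np.zeros((T, k))
      for t in range(1, T):
          Y[t] = Y[t - 1] @ A_true.T + 0.1 * rng.standard_normal(k)

      X, Z = Y[:-1], Y[1:]                      # regress y_t on y_{t-1}
      lam = 1.0                                 # prior precision acts as a ridge penalty
      A_hat = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Z).T

      forecast = Y[-1] @ A_hat.T                # one-step-ahead forecast
      print(np.round(A_hat, 2))
      print(np.round(forecast, 3))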
  17. A two-patch model of Gambian sleeping sickness: Application to vector control strategies in a village and plantations.Karine Chalvet-Monfray, Marc Artzrouni, Jean-Paul Gouteux, Pierre Auger & Philippe Sabatier - 1998 - Acta Biotheoretica 46 (3):207-222.
    A compartmental model is described for the spread of Gambian sleeping sickness in a spatially heterogeneous environment in which vector and human populations migrate between two "patches": the village and the plantations. The number of equilibrium points depends on two "summary parameters": gr, the proportion removed among human infectives, and R0, the basic reproduction number. The origin is stable for R0 < 1. Control strategies are assessed by studying the mix of vector control between the two patches that (...)
  18. A support vector regression model for time series forecasting of the COMEX copper spot price.Esperanza García-Gonzalo, Paulino José García Nieto, Javier Gracia Rodríguez, Fernando Sánchez Lasheras & Gregorio Fidalgo Valverde - 2023 - Logic Journal of the IGPL 31 (4):775-784.
    The price of copper is unstable but it is considered an important indicator of the global economy. Changes in the price of copper point to higher global growth or an impending recession. In this work, the forecasting of the spot prices of copper from the New York Commodity Exchange is studied using a machine learning method, support vector regression coupled with different model schemas (recursive, direct and hybrid multi-step). Using these techniques, three different time series analyses are built (...)
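    The recursive multi-step scheme mentioned in the abstract is straightforward to sketch with an off-the-shelf support vector regressor: fit on lagged values, then feed each prediction back in as the next input. The synthetic series and hyperparameters below are placeholders, not the paper's data or settings.

      # Sketch of the recursive multi-step scheme: fit a support vector
      # regressor on lagged values, then feed each prediction back in.
      # Synthetic series and hyperparameters are placeholders.
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(1)
      price = np.cumsum(rng.standard_normal(300)) + 100.0   # stand-in for spot prices

      lags = 5
      X = np.array([price[i:i + lags] for i in range(len(price) - lags)])
      y = price[lags:]
      model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)

      window = list(price[-lags:])
      forecasts = []
      for _ in range(7):                                    # 7-step-ahead horizon
          nxt = float(model.predict(np.array(window[-lags:]).reshape(1, -1))[0])
          forecasts.append(nxt)
          window.append(nxt)                                # recursion: reuse the prediction
      print(np.round(forecasts, 2))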
  19. Rotated hyperbola model for smooth support vector machine for classification.En Wang - 2018 - Journal of China Universities of Posts and Telecommunications 25 (4).
    This article puts forward a novel smooth rotated hyperbola model for the support vector machine (RHSSVM) for classification. As is well known, the support vector machine (SVM) is based on statistical learning theory and achieves high precision in data classification. However, its objective function is non-differentiable at the zero point, so fast algorithms cannot be used to train and test the SVM. To deal with this, the proposed method is based on the approximation property of the (...)
     
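    The smoothing idea behind this family of models is that the non-differentiable plus function max(0, x) in the SVM objective is replaced by a smooth surrogate so that Newton-type solvers apply. The sketch below uses the classic softplus-style surrogate from the smooth-SVM literature; the paper's rotated-hyperbola function is a different surrogate serving the same purpose and is not reproduced here.

      # The plus function max(0, x) is non-differentiable at 0; smooth SVMs
      # replace it with a smooth surrogate. This one is the classic
      # x + log(1 + exp(-a*x)) / a choice, not the paper's rotated hyperbola.
      import numpy as np

      def plus(x):
          return np.maximum(0.0, x)

      def smooth_plus(x, alpha=5.0):
          # np.logaddexp(0, -a*x) computes log(1 + exp(-a*x)) stably.
          return x + np.logaddexp(0.0, -alpha * x) / alpha

      xs = np.linspace(-2, 2, 9)
      print(np.round(plus(xs), 3))
      print(np.round(smooth_plus(xs), 3))         # close to plus(xs), but differentiable
      print(np.round(smooth_plus(xs, 50.0), 3))   # larger alpha gives a tighter approximation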
  20. On vectorizations of unary generalized quantifiers.Kerkko Luosto - 2012 - Archive for Mathematical Logic 51 (3):241-255.
    Vectorization of a class of structures is a natural notion in finite model theory. Roughly speaking, vectorizations allow tuples to be treated similarly to elements of structures. The importance of vectorizations is highlighted by the fact that if the complexity class PTIME corresponds to a logic with reasonable syntax, then it corresponds to a logic generated via vectorizations by a single generalized quantifier (Dawar in J Log Comput 5(2):213–226, 1995). It is somewhat surprising, then, that there have been few (...)
  21. Towards a cognitive model of genre: Genre as a vector categorization of film: Czech and Slovak papers on semiotics and communication.Vlastimil Zuska - 2000 - In Bernard Elevitch (ed.), Theoria. Charlottesville: Philosophy Doc Ctr. pp. 15--39.
  22. Parameter dependence and outcome dependence in dynamical models for state vector reduction.G. C. Ghirardi, R. Grassi, J. Butterfield & G. N. Fleming - 1993 - Foundations of Physics 23 (3):341-364.
    We apply the distinction between parameter independence and outcome independence to the linear and nonlinear models of a recent nonrelativistic theory of continuous state vector reduction. We show that in the nonlinear model there is a set of realizations of the stochastic process that drives the state vector reduction for which parameter independence is violated for parallel spin components in the EPR-Bohm setup. Such a set has an appreciable probability of occurrence (≈ 1/2). On the other hand, (...)
  23. Using Vector Autoregression Modeling to Reveal Bidirectional Relationships in Gender/Sex-Related Interactions in Mother–Infant Dyads.Elizabeth G. Eason, Nicole S. Carver, Damian G. Kelty-Stephen & Anne Fausto-Sterling - 2020 - Frontiers in Psychology 11.
    Vector autoregression (VAR) modeling allows probing bidirectional relationships in gender/sex development and may support hypothesis testing following multi-modal data collection. We show VAR in three lights: supporting a hypothesis, rejecting a hypothesis, and opening up new questions. To illustrate these capacities of VAR, we reanalyzed longitudinal data that recorded dyadic mother-infant interactions for 15 boys and 15 girls aged 3 to 11 months. We examined monthly counts of 15 infant behaviors and 13 maternal behaviors (Seifert et al., (...)
  24. Identification of Accounting Fraud Based on Support Vector Machine and Logistic Regression Model.Rongyuan Qin - 2021 - Complexity 2021:1-11.
    The authenticity of the company’s accounting information is an important guarantee for the effective operation of the capital market. Accounting fraud is the tampering and distortion of the company’s public disclosure information. The continuous outbreak of fraud cases has dealt a heavy blow to the confidence of investors, shaken the credit foundation of the capital market, and hindered the healthy and stable development of the capital market. Therefore, it is of great theoretical and practical significance to carry out the research (...)
  25. Estimation of Daily Suspended Sediment Load Using a Novel Hybrid Support Vector Regression Model Incorporated with Observer-Teacher-Learner-Based Optimization Method.Siyamak Doroudi, Ahmad Sharafati & Seyed Hossein Mohajeri - 2021 - Complexity 2021:1-13.
    Predicting suspended sediment load in water resource management requires efficient and reliable predictive models. This study considers the support vector regression (SVR) method to predict daily suspended sediment load. Since the SVR has unknown parameters, the observer-teacher-learner-based optimization method is integrated with the SVR model to provide a novel hybrid predictive model. The SVR combined with the genetic algorithm is used as an alternative model. To explore the performance and application of the proposed models, five input combinations (...)
  26. Two-Stage Hybrid Machine Learning Model for High-Frequency Intraday Bitcoin Price Prediction Based on Technical Indicators, Variational Mode Decomposition, and Support Vector Regression.Samuel Asante Gyamerah - 2021 - Complexity 2021:1-15.
    Due to the inherent chaotic and fractal dynamics in the price series of Bitcoin, this paper proposes a two-stage Bitcoin price prediction model by combining the advantage of variational mode decomposition and technical analysis. VMD eliminates the noise signals and stochastic volatility in the price data by decomposing the data into variational mode functions, while technical analysis uses statistical trends obtained from past trading activity and price changes to construct technical indicators. The support vector regression accepts input from (...)
  27. GenAI Model Security.Ken Huang, Ben Goertzel, Daniel Wu & Anita Xie - 2024 - In Ken Huang, Yang Wang, Ben Goertzel, Yale Li, Sean Wright & Jyoti Ponnapalli (eds.), Generative AI Security: Theories and Practices. Springer Nature Switzerland. pp. 163-198.
    Safeguarding GenAI models against threats and aligning them with security requirements is imperative yet challenging. This chapter provides an overview of the security landscape for generative models. It begins by elucidating common vulnerabilities and attack vectors, including adversarial attacks, model inversion, backdoors, data extraction, and algorithmic bias. The practical implications of these threats are discussed, spanning domains like finance, healthcare, and content creation. The narrative then shifts to exploring mitigation strategies and innovative security paradigms. Differential privacy, blockchain-based provenance, quantum-resistant (...)
  28. Probing the Representational Structure of Regular Polysemy via Sense Analogy Questions: Insights from Contextual Word Vectors.Jiangtian Li & Blair C. Armstrong - 2024 - Cognitive Science 48 (3):e13416.
    Regular polysemes are sets of ambiguous words that all share the same relationship between their meanings, such as CHICKEN and LOBSTER both referring to an animal or its meat. To probe how a distributional semantic model, here exemplified by bidirectional encoder representations from transformers (BERT), represents regular polysemy, we analyzed whether its embeddings support answering sense analogy questions similar to “is the mapping between CHICKEN (as an animal) and CHICKEN (as a meat) similar to that which maps between LOBSTER (...)
  29. Parallelograms revisited: Exploring the limitations of vector space models for simple analogies.Joshua C. Peterson, Dawn Chen & Thomas L. Griffiths - 2020 - Cognition 205 (C):104440.
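    The parallelogram model that this paper examines solves an analogy a:b :: c:? by computing b - a + c and returning the nearest word vector. A minimal sketch with hand-made vectors (the standard king/queen example, not the paper's stimuli):

      # Sketch of the parallelogram model of analogy: a:b :: c:? is solved by
      # finding the word vector nearest to b - a + c. Toy vectors only.
      import numpy as np

      vectors = {
          "man":   np.array([1.0, 0.0, 1.0]),
          "woman": np.array([1.0, 1.0, 1.0]),
          "king":  np.array([3.0, 0.0, 2.0]),
          "queen": np.array([3.0, 1.0, 2.0]),
      }

      def analogy(a, b, c, vocab):
          target = vocab[b] - vocab[a] + vocab[c]
          def cos(u, v):
              return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
          # Exclude the three query words, as is standard in analogy evaluation.
          candidates = {w: v for w, v in vocab.items() if w not in {a, b, c}}
          return max(candidates, key=lambda w: cos(candidates[w], target))

      print(analogy("man", "woman", "king", vectors))  # -> "queen"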
  30. On vector spaces over specific fields without choice.Paul Howard & Eleftherios Tachtsis - 2013 - Mathematical Logic Quarterly 59 (3):128-146.
  31. The Operators of Vector Logic.Eduardo Mizraji - 1996 - Mathematical Logic Quarterly 42 (1):27-40.
    Vector logic is a mathematical model of the propositional calculus in which the logical variables are represented by vectors and the logical operations by matrices. In this framework, many tautologies of classical logic are intrinsic identities between operators and, consequently, they are valid beyond the bivalued domain. The operators can be expressed as Kronecker polynomials. These polynomials allow us to show that many important tautologies of classical logic are generated from basic operators via the operations called Type I (...)
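    The construction described in the abstract can be written out directly: truth values are orthonormal vectors, monadic connectives are 2x2 matrices, and dyadic connectives are 2x4 matrices acting on Kronecker products of truth vectors, so tautologies become operator identities. The sketch below follows the standard presentation of vector logic and is not a quotation from the paper.

      # Sketch of vector logic: truth values as orthonormal vectors, connectives
      # as matrices, dyadic connectives acting on Kronecker products.
      import numpy as np

      s = np.array([1.0, 0.0])   # true
      n = np.array([0.0, 1.0])   # false
      outer, kron = np.outer, np.kron

      NOT = outer(n, s) + outer(s, n)                      # swaps true and false
      AND = (outer(s, kron(s, s)) + outer(n, kron(s, n))
             + outer(n, kron(n, s)) + outer(n, kron(n, n)))
      OR  = (outer(s, kron(s, s)) + outer(s, kron(s, n))
             + outer(s, kron(n, s)) + outer(n, kron(n, n)))

      print(NOT @ s)                 # -> n (false)
      print(AND @ kron(s, n))        # -> n (true AND false = false)
      print(OR @ kron(s, n))         # -> s (true OR false = true)

      # A tautology as an operator identity: De Morgan,
      # NOT(p AND q) = (NOT p) OR (NOT q).
      lhs = NOT @ AND
      rhs = OR @ kron(NOT, NOT)
      print(np.allclose(lhs, rhs))   # True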
  32. Second International Workshop on Bioinformatics Research and Applications (IWBRA06)-Extracting Protein-Protein Interactions from the Literature Using the Hidden Vector State Model.Deyu Zhou, Yulan He & Chee Keong Kwoh - 2006 - In O. Stock & M. Schaerf (eds.), Lecture Notes in Computer Science. Springer Verlag. pp. 718-725.
     
  33. Probing Lexical Ambiguity: Word Vectors Encode Number and Relatedness of Senses.Barend Beekhuizen, Blair C. Armstrong & Suzanne Stevenson - 2021 - Cognitive Science 45 (5):e12943.
    Lexical ambiguity—the phenomenon of a single word having multiple, distinguishable senses—is pervasive in language. Both the degree of ambiguity of a word (roughly, its number of senses) and the relatedness of those senses have been found to have widespread effects on language acquisition and processing. Recently, distributional approaches to semantics, in which a word's meaning is determined by its contexts, have led to successful research quantifying the degree of ambiguity, but these measures have not distinguished between the ambiguity of words (...)
  34. Context Update for Lambdas and Vectors.Reinhard Muskens & Mehrnoosh Sadrzadeh - 2016 - In Maxime Amblard, Philippe de Groote, Sylvain Pogodalla & Christian Rétoré (eds.), Logical Aspects of Computational Linguistics. Celebrating 20 Years of LACL (1996–2016). Berlin, Germany: Springer. pp. 247--254.
    Vector models of language are based on the contextual aspects of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, the denotations of phrases, and their compositional properties. In the latter approach the denotation of a sentence determines its truth conditions and can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In this short paper, we develop a vector semantics for (...)
  35. Lambek vs. Lambek: Functorial vector space semantics and string diagrams for Lambek calculus.Bob Coecke, Edward Grefenstette & Mehrnoosh Sadrzadeh - 2013 - Annals of Pure and Applied Logic 164 (11):1079-1100.
    The Distributional Compositional Categorical model is a mathematical framework that provides compositional semantics for meanings of natural language sentences. It consists of a computational procedure for constructing meanings of sentences, given their grammatical structure in terms of compositional type-logic, and given the empirically derived meanings of their words. For the particular case that the meaning of words is modelled within a distributional vector space model, its experimental predictions, derived from real large scale data, have outperformed other empirically (...)
  36. A Type-Driven Vector Semantics for Ellipsis with Anaphora Using Lambek Calculus with Limited Contraction.Gijs Wijnholds & Mehrnoosh Sadrzadeh - 2019 - Journal of Logic, Language and Information 28 (2):331-358.
    We develop a vector space semantics for verb phrase ellipsis with anaphora using type-driven compositional distributional semantics based on the Lambek calculus with limited contraction of Jäger. Distributional semantics has a lot to say about the statistical collocation based meanings of content words, but provides little guidance on how to treat function words. Formal semantics on the other hand, has powerful mechanisms for dealing with relative pronouns, coordinators, and the like. Type-driven compositional distributional semantics brings these two models together. (...)
  37. Rotated Hyperbola Smooth Support Vector Regression.Q. Wu & En Wang - 2015 - Journal of Computational Information Systems 11 (5).
    ε-support vector regression (ε-SVR), as a constrained minimization problem, can be converted into an unconstrained convex quadratic program. The smooth function is the essence of the ε-smooth support vector regression (ε-SSVR). In this paper, a new rotated hyperbola function is proposed to replace the ε-insensitive loss function. The ε-rotated hyperbola smooth support vector regression (ε-RHSSVR) model is presented. Theoretical analyses show that the derived smooth function has improved approximation precision compared with other smooth approximating functions. The Newton-Armijo (...)
     
  38. Strongly minimal fusions of vector spaces.Kitty L. Holland - 1997 - Annals of Pure and Applied Logic 83 (1):1-22.
    We provide a simple and transparent construction of Hrushovski's strongly minimal fusions in the case where the fused strongly minimal sets are vector spaces. We strengthen Hrushovski's result by showing that the strongly minimal fusions are model complete.
  39. Collapse of the state vector and psychokinetic effect.Helmut Schmidt - 1982 - Foundations of Physics 12 (6):565-581.
    Eugene Wigner and others have speculated that the “collapse of the state vector” during an observation might be a physically real process so that some modification of current quantum theory would be required to describe the interaction with a conscious observer appropriately. Experimental reports on the “psychokinetic effect” as a mental influence on the outcome of quantum jumps suggest that perhaps this effect might be vital for an understanding of the observer's role in quantum mechanics. Combining these two speculations we introduce (...)
  40. Grounding the Vector Space of an Octopus: Word Meaning from Raw Text.Anders Søgaard - 2023 - Minds and Machines 33 (1):33-54.
    Most, if not all, philosophers agree that computers cannot learn what words refer to from raw text alone. While many attacked Searle’s Chinese Room thought experiment, no one seemed to question this most basic assumption. For how can computers learn something that is not in the data? Emily Bender and Alexander Koller (2020) recently presented a related thought experiment—the so-called Octopus thought experiment, which replaces the rule-based interlocutor of Searle’s thought experiment with a neural language model. The (...)
  41. Neural signalling of probabilistic vectors.Nicholas Shea - 2014 - Philosophy of Science 81 (5):902-913.
    Recent work combining cognitive neuroscience with computational modelling suggests that distributed patterns of neural firing may represent probability distributions. This paper asks: what makes it the case that distributed patterns of firing, as well as carrying information about (correlating with) probability distributions over worldly parameters, represent such distributions? In examples of probabilistic population coding, it is the way information is used in downstream processing so as to lead to successful behaviour. In these cases content depends on factors beyond bare information, (...)
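    The paper's question is what makes such firing patterns representations; the technical idea it starts from, probabilistic population coding, can be sketched as follows. With independent Poisson neurons whose tuning curves are f_i(s), a vector of spike counts r carries a posterior p(s|r) proportional to the product of f_i(s)^r_i e^(-f_i(s)). The tuning curves, counts and stimulus grid below are illustrative assumptions, not the paper's examples.

      # Sketch of probabilistic population coding: decode a posterior over a
      # stimulus variable from a vector of Poisson spike counts.
      import numpy as np

      s_grid = np.linspace(-90, 90, 181)              # candidate stimulus values
      preferred = np.linspace(-80, 80, 9)             # preferred stimuli of 9 neurons
      gain, width = 10.0, 20.0
      tuning = gain * np.exp(-0.5 * ((s_grid[None, :] - preferred[:, None]) / width) ** 2)

      rng = np.random.default_rng(2)
      true_s = 15.0
      rates = gain * np.exp(-0.5 * ((true_s - preferred) / width) ** 2)
      r = rng.poisson(rates)                          # observed spike counts

      # Log posterior under a flat prior, up to an additive constant.
      log_post = r @ np.log(tuning) - tuning.sum(axis=0)
      post = np.exp(log_post - log_post.max())
      post /= post.sum()

      print(s_grid[post.argmax()])                    # should lie near true_s = 15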
  42. Exploring What Is Encoded in Distributional Word Vectors: A Neurobiologically Motivated Analysis.Akira Utsumi - 2020 - Cognitive Science 44 (6):e12844.
    The pervasive use of distributional semantic models or word embeddings for both cognitive modeling and practical application is because of their remarkable ability to represent the meanings of words. However, relatively little effort has been made to explore what types of information are encoded in distributional word vectors. Knowing the internal knowledge embedded in word vectors is important for cognitive modeling using distributional semantic models. Therefore, in this paper, we attempt to identify the knowledge encoded in word vectors by conducting (...)
  43. Equilibria with vector-valued utilities and preference information. The analysis of a mixed duopoly.Amparo M. Mármol, Luisa Monroy, M. Ángeles Caraballo & Asunción Zapata - 2017 - Theory and Decision 83 (3):365-383.
    This paper deals with the equilibria of games when the agents have multiple objectives and, therefore, their utilities cannot be represented by a single value, but by a vector containing the various dimensions of the utility. Our approach allows the incorporation of partial information about the preferences of the agents into the model, and permits the identification of the set of equilibria in accordance with this information. We also propose an additional conservative criterion which can be applied in (...)
  44. Relativistic Dynamics of Vector Bosons in the Field of Gravitational Radiation.A. Balakin & V. Kurbanova - 2001 - Foundations of Physics 31 (7):1039-1049.
    We consider a model of the state evolution of relativistic vector bosons, which includes both the dynamical equations for the particle four-velocity and the equations for the polarization four-vector evolution in the field of a nonlinear plane gravitational wave. In addition to the gravitational minimal coupling, tidal forces linear in curvature tensor are suggested to drive the particle state evolution. The exact solutions of the evolutionary equations are obtained. Birefringence and tidal deviations from the geodesic motion are (...)
  45. Bézier Function Smooth Support Vector Regression.Q. Wu & En Wang - 2015 - ICIC Express Letters, Part B: Applications 6 (7).
    The smooth function is the essence of the ε-smooth support vector regression (ε-SSVR), which is an unconstrained convex quadratic program. In this paper, a Bézier function is proposed as a new smooth function to replace the ε-insensitive loss function of ε-SSVR, tolerating a smaller error when fitting given data sets linearly and nonlinearly. The ε-Bézier function smooth support vector regression (ε-BSSVR) model is presented. Theoretical analyses show that the derived smooth function has improved approximation precision compared with other (...)
     
  46. Bezier Smooth Support Vector Classification.Q. Wu & En Wang - 2015 - Journal of Computational Information Systems 11 (12).
    A new smoothing method for solving support vector machine classification (SVC) is presented. Since the objective function of the unconstrained SVC is non-smooth, we apply a smoothing technique, replacing the SVC objective with a Bézier function to obtain a class of Bézier smooth support vector machines (BSSVM). The fast Newton-Armijo algorithm is used to solve the BSSVM. Theoretical analysis and numerical results illustrate that this smooth SVM model improves in efficiency and accuracy compared with other smooth (...)
     
  47. Deep Reinforcement Learning for Vectored Thruster Autonomous Underwater Vehicle Control.Tao Liu, Yuli Hu & Hui Xu - 2021 - Complexity 2021:1-25.
    Autonomous underwater vehicles (AUVs) are widely used to accomplish various missions in the complex marine environment; the design of a control system for AUVs is particularly difficult due to the high nonlinearity, variations in hydrodynamic coefficients, and external forces from ocean currents. In this paper, we propose a controller based on deep reinforcement learning in a simulation environment for studying the control performance of the vectored thruster AUV. RL is an important method of artificial intelligence that can learn behavior through trial-and-error (...)
  48. Beyond Standard Model Collider Phenomenology of Higgs Physics and Supersymmetry.Marc Christopher Thomas - 2016 - Cham: Imprint: Springer.
    This thesis studies collider phenomenology of physics beyond the Standard Model at the Large Hadron Collider (LHC). It also explores in detail advanced topics related to the Higgs boson and supersymmetry - one of the most exciting and well-motivated streams in particle physics. In particular, it finds a very large enhancement of multiple Higgs boson production in vector-boson scattering when Higgs couplings to gauge bosons differ from those predicted by the Standard Model. The thesis demonstrates that due to (...)
  49. Model-checking CTL* over flat Presburger counter systems.Stéphane Demri, Alain Finkel, Valentin Goranko & Govert van Drimmelen - 2010 - Journal of Applied Non-Classical Logics 20 (4):313-344.
    This paper concerns model-checking of fragments and extensions of CTL* on infinite-state Presburger counter systems, where the states are vectors of integers and the transitions are determined by means of relations definable within Presburger arithmetic. In general, reachability properties of counter systems are undecidable, but we have identified a natural class of admissible counter systems (ACS) for which we show that the quantification over paths in CTL* can be simulated by quantification over tuples of natural numbers, eventually allowing translation (...)
  50. The Central Complex as a Potential Substrate for Vector Based Navigation.Florent Le Moël, Thomas Stone, Mathieu Lihoreau, Antoine Wystrach & Barbara Webb - 2019 - Frontiers in Psychology 10.
    Insects use path integration (PI) to maintain a home vector, but can also store and recall vector-memories that take them from home to a food location, and even allow them to take novel shortcuts between food locations. The neural circuit of the Central Complex (a brain area that receives compass and optic flow information) forms a plausible substrate for these behaviours. A recent model, grounded in neurophysiological and neuroanatomical data, can account for PI during outbound exploratory routes (...)
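    Path integration, the basic operation behind the home vector discussed above, amounts to accumulating displacement vectors and keeping their negative sum. A minimal sketch with made-up headings and unit step lengths (the cited model is a neural circuit of the Central Complex, which this does not attempt to reproduce):

      # Sketch of path integration: the home vector is the negative of the
      # accumulated displacement. Headings and step lengths are made up.
      import numpy as np

      rng = np.random.default_rng(3)
      position = np.zeros(2)
      home_vector = np.zeros(2)

      for _ in range(200):                          # outbound foraging route
          heading = rng.uniform(0, 2 * np.pi)
          step = np.array([np.cos(heading), np.sin(heading)])
          position += step
          home_vector -= step                       # integrate the path

      print(np.round(position, 2))
      print(np.round(position + home_vector, 6))    # ~[0, 0]: following the home
                                                    # vector returns the agent home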
1 — 50 / 994