Results for 'Statistical learning theory'

994 results found
  1. Reliable Reasoning: Induction and Statistical Learning Theory. Gilbert Harman & Sanjeev Kulkarni - 2007 - Bradford.
    In _Reliable Reasoning_, Gilbert Harman and Sanjeev Kulkarni -- a philosopher and an engineer -- argue that philosophy and cognitive science can benefit from statistical learning theory, the theory that lies behind recent advances in machine learning. The philosophical problem of induction, for example, is in part about the reliability of inductive reasoning, where the reliability of a method is measured by its statistically expected percentage of errors -- a central topic in SLT. After discussing (...)
    (37 citations)
  2. Falsificationism and Statistical Learning Theory: Comparing the Popper and Vapnik-Chervonenkis Dimensions. David Corfield, Bernhard Schölkopf & Vladimir Vapnik - 2009 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 40 (1):51-58.
    We compare Karl Popper’s ideas concerning the falsifiability of a theory with similar notions from the part of statistical learning theory known as VC-theory. Popper’s notion of the dimension of a theory is contrasted with the apparently very similar VC-dimension. Having located some divergences, we discuss how best to view Popper’s work from the perspective of statistical learning theory, either as a precursor or as aiming to capture a different (...) activity.
    (8 citations)
  3. Statistical learning theory as a framework for the philosophy of induction. Gilbert Harman & Sanjeev Kulkarni - manuscript
    Statistical Learning Theory (e.g., Hastie et al., 2001; Vapnik, 1998, 2000, 2006) is the basic theory behind contemporary machine learning and data-mining. We suggest that the theory provides an excellent framework for philosophical thinking about inductive inference.
     
    (1 citation)
  4. Statistical learning theory applied to an instrumental avoidance situation. Arthur L. Brody - 1957 - Journal of Experimental Psychology 54 (4):240.
  5. Statistical Learning Theory: A Tutorial. Sanjeev R. Kulkarni & Gilbert Harman - 2011 - Wiley Interdisciplinary Reviews: Computational Statistics 3 (6):543-556.
    In this article, we provide a tutorial overview of some aspects of statistical learning theory, which also goes by other names such as statistical pattern recognition, nonparametric classification and estimation, and supervised learning. We focus on the problem of two-class pattern classification for various reasons. This problem is rich enough to capture many of the interesting aspects that are present in the cases of more than two classes and in the problem of estimation, and many (...)
     
  6. Statistical learning theory, capacity, and complexity. Bernhard Schölkopf - 2003 - Complexity 8 (4):87-94.
  7. Spontaneous recovery and statistical learning theory. Lloyd E. Homme - 1956 - Journal of Experimental Psychology 51 (3):205.
  8. Nonrandom stimulus sampling in statistical learning theory. William F. Prokasy - 1961 - Psychological Review 68 (3):219-224.
  9. A stimulus-trace hypothesis for statistical learning theory. Robert S. Witte - 1959 - Journal of Experimental Psychology 57 (5):273.
  10. Testability and Ockham’s Razor: How Formal and Statistical Learning Theory Converge in the New Riddle of Induction. Daniel Steel - 2009 - Journal of Philosophical Logic 38 (5):471-489.
    Nelson Goodman's new riddle of induction forcefully illustrates a challenge that must be confronted by any adequate theory of inductive inference: provide some basis for choosing among alternative hypotheses that fit past data but make divergent predictions. One response to this challenge is to distinguish among alternatives by means of some epistemically significant characteristic beyond fit with the data. Statistical learning theory takes this approach by showing how a concept similar to Popper's notion of degrees of (...)
    (4 citations)
  11. Foundations of Statistical Learning Theory, 1. The Linear Model for Simple Learning. W. K. Estes & Patrick Suppes - 1959 - British Journal for the Philosophy of Science 10 (39):251-252.
     
    (1 citation)
  12. A test of a statistical learning theory model for two-choice behavior with double stimulus events. Norman H. Anderson & David A. Grant - 1957 - Journal of Experimental Psychology 54 (5):305.
  13. Simple Models in Complex Worlds: Occam’s Razor and Statistical Learning Theory. Falco J. Bargagli Stoffi, Gustavo Cevolani & Giorgio Gnecco - 2022 - Minds and Machines 32 (1):13-42.
    The idea that “simplicity is a sign of truth”, and the related “Occam’s razor” principle, stating that, all other things being equal, simpler models should be preferred to more complex ones, have been long discussed in philosophy and science. We explore these ideas in the context of supervised machine learning, namely the branch of artificial intelligence that studies algorithms which balance simplicity and accuracy in order to effectively learn about the features of the underlying domain. Focusing on statistical (...)
  14. Analysis of a verbal conditioning situation in terms of statistical learning theory. W. K. Estes & J. H. Straughan - 1954 - Journal of Experimental Psychology 47 (4):225.
  15. Testability and Ockham’s Razor: How Formal and Statistical Learning Theory Converge in the New Riddle of Induction. [REVIEW] Daniel Steel - 2009 - Journal of Philosophical Logic 38 (5):471-489.
    Nelson Goodman’s new riddle of induction forcefully illustrates a challenge that must be confronted by any adequate theory of inductive inference: provide some basis for choosing among alternative hypotheses that fit past data but make divergent predictions. One response to this challenge is to distinguish among alternatives by means of some epistemically significant characteristic beyond fit with the data. Statistical learning theory takes this approach by showing how a concept similar to Popper’s notion of degrees of (...)
    (3 citations)
  16. Mind changes and testability: How formal and statistical learning theory converge in the new Riddle of induction. Daniel Steel - manuscript
    This essay demonstrates a previously unnoticed connection between formal and statistical learning theory with regard to Nelson Goodman’s new riddle of induction. Discussions of Goodman’s riddle in formal learning theory explain how conjecturing “all green” before “all grue” can enhance efficient convergence to the truth, where efficiency is understood in terms of minimizing the maximum number of retractions or “mind changes.” Vapnik-Chervonenkis (VC) dimension is a central concept in statistical learning theory and (...)
  17. A statistical learning approach to a problem of induction. Kino Zhao - manuscript
    At its strongest, Hume's problem of induction denies the existence of any well justified assumptionless inductive inference rule. At the weakest, it challenges our ability to articulate and apply good inductive inference rules. This paper examines an analysis that is closer to the latter camp. It reviews one answer to this problem drawn from the VC theorem in statistical learning theory and argues for its inadequacy. In particular, I show that it cannot be computed, in general, whether (...)
  18. What Determines Visual Statistical Learning Performance? Insights From Information Theory. Noam Siegelman, Louisa Bogaerts & Ram Frost - 2019 - Cognitive Science 43 (12):e12803.
    In order to extract the regularities underlying a continuous sensory input, the individual elements constituting the stream have to be encoded and their transitional probabilities (TPs) should be learned. This suggests that variance in statistical learning (SL) performance reflects efficiency in encoding representations as well as efficiency in detecting their statistical properties. These processes have been taken to be independent and temporally modular, where first, elements in the stream are encoded into internal representations, and then the co‐occurrences (...)
  19. An analysis of the effect of nonreinforced trials in terms of statistical learning theory. Richard C. Atkinson - 1956 - Journal of Experimental Psychology 52 (1):28.
  20. An analysis of two-person game situations in terms of statistical learning theory. Richard C. Atkinson & Patrick Suppes - 1958 - Journal of Experimental Psychology 55 (4):369.
  21. Précis of Reliable Reasoning: Induction and Statistical Learning Theory. Gilbert Harman & Sanjeev Kulkarni - 2009 - Abstracta 5 (S3):5-9.
     
  22. Redefining “Learning” in Statistical Learning: What Does an Online Measure Reveal About the Assimilation of Visual Regularities? Noam Siegelman, Louisa Bogaerts, Ofer Kronenfeld & Ram Frost - 2018 - Cognitive Science 42 (S3):692-727.
    From a theoretical perspective, most discussions of statistical learning have focused on the possible “statistical” properties that are the object of learning. Much less attention has been given to defining what “learning” is in the context of “statistical learning.” One major difficulty is that SL research has been monitoring participants’ performance in laboratory settings with a strikingly narrow set of tasks, where learning is typically assessed offline, through a set of two-alternative-forced-choice questions, (...)
    (16 citations)
  23. How Statistical Learning Can Play Well with Universal Grammar. Lisa S. Pearl - 2021 - In Nicholas Allott, Terje Lohndal & Georges Rey (eds.), A Companion to Chomsky. Wiley. pp. 267-286.
    A key motivation for Universal Grammar (UG) is developmental: UG can help children acquire the linguistic knowledge that they do as quickly as they do from the data that's available to them. Some of the most fruitful recent work in language acquisition has combined ideas about different hypothesis space building blocks with domain‐general statistical learning. Statistical learning can then provide a way to help navigate the hypothesis space in order to converge on the correct hypothesis. Reinforcement (...)
  24. Acquiring Complex Communicative Systems: Statistical Learning of Language and Emotion. Ashley L. Ruba, Seth D. Pollak & Jenny R. Saffran - 2022 - Topics in Cognitive Science 14 (3):432-450.
    In this article, we consider infants’ acquisition of foundational aspects of language and emotion through the lens of statistical learning. By taking a comparative developmental approach, we highlight ways in which the learning problems presented by input from these two rich communicative domains are both similar and different. Our goal is to encourage other scholars to consider multiple domains of human experience when developing theories in developmental cognitive science.
    (1 citation)
  25. Chunking Versus Transitional Probabilities: Differentiating Between Theories of Statistical Learning. Samantha N. Emerson & Christopher M. Conway - 2023 - Cognitive Science 47 (5):e13284.
    There are two main approaches to how statistical patterns are extracted from sequences: The transitional probability approach proposes that statistical learning occurs through the computation of probabilities between items in a sequence. The chunking approach, including models such as PARSER and TRACX, proposes that units are extracted as chunks. Importantly, the chunking approach suggests that the extraction of full units weakens the processing of subunits while the transitional probability approach suggests that both units and subunits should strengthen. (...)
  26. Review of Gilbert Harman, Sanjeev Kulkarni, Reliable Reasoning: Induction and Statistical Learning Theory. [REVIEW] Kevin Kelly - 2008 - Notre Dame Philosophical Reviews 2008 (3).
    (1 citation)
  27. Tracking probabilistic truths: a logic for statistical learning. Alexandru Baltag, Soroush Rafiee Rad & Sonja Smets - 2021 - Synthese 199 (3-4):9041-9087.
    We propose a new model for forming and revising beliefs about unknown probabilities. To go beyond what is known with certainty and represent the agent’s beliefs about probability, we consider a plausibility map, associating to each possible distribution a plausibility ranking. Beliefs are defined as in Belief Revision Theory, in terms of truth in the most plausible worlds. We consider two forms of conditioning or belief update, corresponding to the acquisition of two types of information: learning observable evidence (...)
    (1 citation)
  28. Machine learning, inductive reasoning, and reliability of generalisations. Petr Spelda - 2020 - AI and Society 35 (1):29-37.
    The present paper shows how statistical learning theory and machine learning models can be used to enhance understanding of AI-related epistemological issues regarding inductive reasoning and reliability of generalisations. Towards this aim, the paper proceeds as follows. First, it expounds Price’s dual image of representation in terms of the notions of e-representations and i-representations that constitute subject naturalism. For Price, this is not a strictly anti-representationalist position but rather a dualist one (e- and i-representations). Second, the (...)
  29. Statistical theory of distributional phenomena in learning. W. K. Estes - 1955 - Psychological Review 62 (5):369-377.
  30. Toward a statistical theory of learning. William K. Estes - 1950 - Psychological Review 57 (2):94-107.
  31. Statistical Machine Learning and the Logic of Scientific Discovery. Antonino Freno - 2009 - Iris. European Journal of Philosophy and Public Debate 1 (2):375-388.
    One important problem in the philosophy of science is whether there can be a normative theory of discovery, as opposed to a normative theory of justification. Although the possibility of developing a logic of scientific discovery has been often doubted by philosophers, it is particularly interesting to consider how the basic insights of a normative theory of discovery have been turned into an effective research program in computer science, namely the research field of machine learning. In (...)
     
    (1 citation)
  32. Statistical models of syntax learning and use. Mark Johnson & Stefan Riezler - 2002 - Cognitive Science 26 (3):239-253.
    This paper shows how to define probability distributions over linguistically realistic syntactic structures in a way that permits us to define language learning and language comprehension as statistical problems. We demonstrate our approach using lexical‐functional grammar (LFG), but our approach generalizes to virtually any linguistic theory. Our probabilistic models are maximum entropy models. In this paper we concentrate on statistical inference procedures for learning the parameters that define these probability distributions. We point out some of (...)
    (3 citations)
  33. Does statistics anxiety impact academic dishonesty? Academic challenges in the age of distance learning. Keren Grinautsky, Pnina Steinberger & Yovav Eshet - 2022 - International Journal for Educational Integrity 18 (1).
    This study discusses the mediating role of statistics anxiety and motivation in the relationship comprising academic dishonesty, personality traits, and previous academic achievements in three different learning environments. Self-determination theory provides a broad psychological framework for these phenomena. Data were collected from 649 bachelor-degree students in the Social Sciences in five Israeli academic institutions. Structural equation modelling was employed to investigate the research variables’ relationships. Findings indicate that statistics anxiety mediates the relationship between personality traits and academic dishonesty (...)
  34. Toward a statistical theory of learning. William K. Estes - 1994 - Psychological Review 101 (2):282-289.
  35. Testing Theories of Transfer Using Error Rate Learning Curves. Kenneth R. Koedinger, Michael V. Yudelson & Philip I. Pavlik - 2016 - Topics in Cognitive Science 8 (3):589-609.
    We analyze naturally occurring datasets from student use of educational technologies to explore a long-standing question of the scope of transfer of learning. We contrast a faculty theory of broad transfer with a component theory of more constrained transfer. To test these theories, we develop statistical models of them. These models use latent variables to represent mental functions that are changed while learning to cause a reduction in error rates for new tasks. Strong versions of (...)
    (1 citation)
  36. Rational Rules: Towards a Theory of Moral Learning. Shaun Nichols - 2021 - Oxford University Press.
    Rational Rules argues that moral learning can be understood in terms of general-purpose rational learning procedures. Nichols provides statistical learning accounts of some fundamental aspects of moral development, combining aspects of traditional empiricist and rationalist approaches.
    (10 citations)
  37. All Together Now: Concurrent Learning of Multiple Structures in an Artificial Language. Alexa R. Romberg & Jenny R. Saffran - 2013 - Cognitive Science 37 (7):1290-1320.
    Natural languages contain many layers of sequential structure, from the distribution of phonemes within words to the distribution of phrases within utterances. However, most research modeling language acquisition using artificial languages has focused on only one type of distributional structure at a time. In two experiments, we investigated adult learning of an artificial language that contains dependencies between both adjacent and non-adjacent words. We found that learners rapidly acquired both types of regularities and that the strength of the adjacent (...)
    (14 citations)
  38. General properties of bayesian learning as statistical inference determined by conditional expectations. Zalán Gyenis & Miklós Rédei - 2017 - Review of Symbolic Logic 10 (4):719-755.
    We investigate the general properties of general Bayesian learning, where “general Bayesian learning” means inferring a state from another that is regarded as evidence, and where the inference is conditionalizing the evidence using the conditional expectation determined by a reference probability measure representing the background subjective degrees of belief of a Bayesian Agent performing the inference. States are linear functionals that encode probability measures by assigning expectation values to random variables via integrating them with respect to the probability (...)
    (17 citations)
  39. Reliability in Machine Learning. Thomas Grote, Konstantin Genin & Emily Sullivan - 2024 - Philosophy Compass 19 (5):e12974.
    Issues of reliability are claiming center-stage in the epistemology of machine learning. This paper unifies different branches in the literature and points to promising research directions, whilst also providing an accessible introduction to key concepts in statistics and machine learning – as far as they are concerned with reliability.
  40. Learning the Meanings of Function Words From Grounded Language Using a Visual Question Answering Model. Eva Portelance, Michael C. Frank & Dan Jurafsky - 2024 - Cognitive Science 48 (5):e13448.
    Interpreting a seemingly simple function word like “or,” “behind,” or “more” can require logical, numerical, and relational reasoning. How are such words learned by children? Prior acquisition theories have often relied on positing a foundation of innate knowledge. Yet recent neural‐network‐based visual question answering models apparently can learn to use function words as part of answering questions about complex visual scenes. In this paper, we study what these models learn about function words, in the hope of better understanding how the (...)
  41. Future Time Orientation and Learning Engagement Through the Lens of Self-Determination Theory for Freshman: Evidence From Cross-Lagged Analysis. Michael Yao-Ping Peng & Zizai Zhang - 2022 - Frontiers in Psychology 12.
    Future time orientation is a cognitive construct concerning future time, with a distinctive motivational function and an effect on academic performance. Previous studies have explored the influence of future time orientation on the learning process only at a single point in time, and most are cross-sectional. To further explore the cross-lagged relationship between freshmen's future time orientation and learning engagement across different periods, AMOS 23.0 was used for cross-lagged analysis in (...)
    (1 citation)
  42. Building the Theory of Ecological Rationality. Peter M. Todd & Henry Brighton - 2016 - Minds and Machines 26 (1-2):9-30.
    While theories of rationality and decision making typically adopt either a single-powertool perspective or a bag-of-tricks mentality, the research program of ecological rationality bridges these with a theoretically-driven account of when different heuristic decision mechanisms will work well. Here we describe two ways to study how heuristics match their ecological setting: The bottom-up approach starts with psychologically plausible building blocks that are combined to create simple heuristics that fit specific environments. The top-down approach starts from the statistical problem facing (...)
    (15 citations)
  43. The Devil in the Data: Machine Learning & the Theory-Free Ideal. Mel Andrews - unknown
    Machine learning (ML) refers to a class of computer-facilitated methods of statistical modelling. ML modelling techniques are now being widely adopted across the sciences. A number of outspoken representatives from the general public, computer science, various scientific fields, and philosophy of science alike seem to share in the belief that ML will radically disrupt scientific practice or the variety of epistemic outputs science is capable of producing. Such a belief is held, at least in part, because its adherents (...)
  44. Philosophical Problems of Statistical Inference: Learning from R.A. Fisher. T. Seidenfeld - 1979 - Springer Verlag.
    Probability and inverse inference; Neyman-Pearson theory; Fisherian significance testing; The fiducial argument: one parameter; The fiducial argument: several parameters; Ian Hacking's theory; Henry Kyburg's theory; Relevance and experimental design.
    (10 citations)
  45. Statistical mechanics of learning: Generalization. Manfred Opper - 2002 - In Michael A. Arbib (ed.), The Handbook of Brain Theory and Neural Networks, Second Edition. MIT Press. pp. 922-925.
  46. Selective sampling in discrimination learning. David L. La Berge & Adrienne Smith - 1957 - Journal of Experimental Psychology 54 (6):423.
  47. Mathematical statistics and metastatistical analysis. Andrés Rivadulla - 1991 - Erkenntnis 34 (2):211-236.
    This paper deals with meta-statistical questions concerning frequentist statistics. In Sections 2 to 4 I analyse the dispute between Fisher and Neyman on the so-called logic of statistical inference, a polemic that has accompanied the development of mathematical statistics. My conclusion is that, whenever mathematical statistics makes it possible to draw inferences, it only uses deductive reasoning. Therefore I reject Fisher's inductive approach to statistical estimation theory and adhere to Neyman's deductive one. (...)
  48. The explanation game: a formal framework for interpretable machine learning. David S. Watson & Luciano Floridi - 2020 - Synthese 198 (10):1-32.
    We propose a formal framework for interpretable machine learning. Combining elements from statistical learning, causal interventionism, and decision theory, we design an idealised explanation game in which players collaborate to find the best explanation for a given algorithmic prediction. Through an iterative procedure of questions and answers, the players establish a three-dimensional Pareto frontier that describes the optimal trade-offs between explanatory accuracy, simplicity, and relevance. Multiple rounds are played at different levels of abstraction, allowing the players (...)
    (16 citations)
  49. Statistical models for the induction and use of selectional preferences. Marc Light & Warren Greiff - 2002 - Cognitive Science 26 (3):269-281.
    Selectional preferences have a long history in both generative and computational linguistics. However, since the publication of Resnik's dissertation in 1993, a new approach has surfaced in the computational linguistics community. This new line of research combines knowledge represented in a pre‐defined semantic class hierarchy with statistical tools including information theory, statistical modeling, and Bayesian inference. These tools are used to learn selectional preferences from examples in a corpus. Instead of simple sets of semantic classes, selectional preferences (...)
    (6 citations)
  50. Cognitive Biases, Linguistic Universals, and Constraint‐Based Grammar Learning. Jennifer Culbertson, Paul Smolensky & Colin Wilson - 2013 - Topics in Cognitive Science 5 (3):392-424.
    According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology—the distribution of linguistic patterns across the world's languages—and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   10 citations  
Showing 1–50 of 994 results