Latent semantic analysis (LSA), a disembodied learning machine, acquires human word meaning vicariously from language alone

Behavioral and Brain Sciences 22 (4):624-625 (1999)

Abstract

The hypothesis that perceptual mechanisms could have more representational and logical power than usually assumed is interesting and provocative, especially with regard to brain evolution. However, the importance of embodiment and grounding is exaggerated, and the implication that there is no highly abstract representation at all, and that human-like knowledge cannot be learned or represented without human bodies, is very doubtful. A machine-learning model, Latent Semantic Analysis (LSA), which closely mimics human word and passage meaning relations, is offered as a counterexample.
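The abstract does not spell out how LSA works, but its core technique is well established: build a term-by-passage co-occurrence matrix, reduce it with a truncated singular value decomposition, and compare words by cosine similarity in the resulting latent space. The following is a minimal sketch of that idea; the toy vocabulary, counts, and dimension `k=2` are illustrative assumptions, not data from the paper (real LSA also applies log/entropy weighting to the counts first).

```python
import numpy as np

# Toy term-by-passage count matrix: rows = terms, columns = passages.
# (Illustrative counts only; real LSA corpora have thousands of each.)
terms = ["dog", "puppy", "cat", "piano"]
X = np.array([
    [2, 1, 0, 0],   # "dog" occurs in passages 0 and 1
    [1, 2, 0, 0],   # "puppy" shares those contexts with "dog"
    [0, 0, 2, 1],   # "cat" occurs in passages 2 and 3
    [0, 0, 0, 3],   # "piano" occurs only in passage 3
], dtype=float)

# Truncated SVD: keep only the k largest singular values/vectors.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vectors = U[:, :k] * s[:k]   # each row: a term in the latent space

def cosine(a, b):
    """Cosine similarity between two term vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Terms that share contexts end up near each other; unrelated terms do not.
sim_dog_puppy = cosine(term_vectors[0], term_vectors[1])
sim_dog_piano = cosine(term_vectors[0], term_vectors[3])
```

Even with never having seen "dog" and "puppy" in the same passage position, the dimension reduction places them close together, which is the sense in which LSA learns word meaning "vicariously from language alone."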
