The Hidden Markov Topic Model: A Probabilistic Model of Semantic Representation

Topics in Cognitive Science 2 (1):101-113 (2010)


In this paper, we describe a model that learns semantic representations from the distributional statistics of language. This model goes beyond the common bag-of-words paradigm and infers semantic representations by taking into account the inherent sequential nature of linguistic data. The model we describe, which we refer to as a Hidden Markov Topics model, is a natural extension of the current state of the art in Bayesian bag-of-words models, namely the Topics model of Griffiths, Steyvers, and Tenenbaum (2007); it preserves that model's strengths while extending its scope to incorporate more fine-grained linguistic information.
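To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of the kind of generative process such a model describes: hidden topics evolve as a Markov chain over word positions (rather than being drawn independently, as in a bag-of-words model), and each word is emitted from its current topic's word distribution. The tiny vocabulary, the two topics, and all probability values are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy vocabulary and two hypothetical topics.
vocab = ["bank", "river", "money", "loan", "water"]
n_topics = 2

# Topic transition matrix: P(topic_t | topic_{t-1}).
# Rows sum to 1; high diagonal values make topics persist locally,
# which is what distinguishes this from a bag-of-words model.
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

# Per-topic word distributions: P(word | topic).
emit = np.array([[0.05, 0.45, 0.02, 0.03, 0.45],   # a "nature" topic
                 [0.30, 0.02, 0.35, 0.30, 0.03]])  # a "finance" topic

def generate(n_words, start_topic=0):
    """Generate a word sequence by walking the hidden topic Markov chain."""
    words, topic = [], start_topic
    for _ in range(n_words):
        topic = rng.choice(n_topics, p=trans[topic])       # next hidden topic
        word = rng.choice(len(vocab), p=emit[topic])       # emit a word
        words.append(vocab[word])
    return words

print(generate(8))
```

In a plain Topics model, each word's topic would be drawn independently from a document-level mixture; here the transition matrix couples adjacent words, so sequential structure in the text can inform the inferred semantic representations.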


