2009 ModelingSyntacticStructure
- (Jiang, 2009) ⇒ Jing Jiang. (2009). “Modeling Syntactic Structures of Topics with a Nested HMM-LDA.” In: Proceedings of the Ninth IEEE International Conference on Data Mining (ICDM 2009). doi:10.1109/ICDM.2009.144
Subject Headings:
Notes
Cited by
Quotes
Abstract
- Latent Dirichlet allocation (LDA) is a commonly used topic modeling method for text analysis and mining. Standard LDA treats documents as bags of words, ignoring the syntactic structures of sentences. In this paper, we propose a hybrid model that embeds hidden Markov models (HMMs) within LDA topics to jointly model both the topics and the syntactic structures within each topic. Our model is general and subsumes standard LDA and HMM as special cases. Compared with standard LDA and HMM, our model can simultaneously discover both topic-specific content words and background functional words shared among topics. Our model can also automatically separate content words that play different roles within a topic. Using perplexity as the evaluation metric, our model achieves lower perplexity on unseen test documents than standard LDA, which demonstrates its better generalization power.
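The abstract's key idea — each LDA topic carrying its own HMM that switches between topic-specific content words and background function words shared among topics — can be illustrated with a toy generative sketch. This is not the paper's model or its parameters; the vocabularies, transition probabilities, and two-state structure below are illustrative assumptions only.

```python
import random

random.seed(0)

# Illustrative vocabularies (not from the paper): each topic has its
# own content words; function words are shared across all topics.
TOPIC_WORDS = {
    0: ["model", "topic", "inference"],
    1: ["gene", "protein", "cell"],
}
BACKGROUND_WORDS = ["the", "of", "and", "a"]

# Each topic carries its own 2-state HMM over word classes:
# state 0 emits that topic's content words, state 1 emits shared
# background function words.  Row i gives P(next state | current = i),
# with column 0 = probability of moving to the content state.
TRANSITIONS = {
    0: [[0.6, 0.4], [0.5, 0.5]],
    1: [[0.7, 0.3], [0.4, 0.6]],
}

def generate_sentence(topic, length=8):
    """Sample one sentence from the nested HMM of the given topic."""
    state = 0  # start in the content-word state
    words = []
    for _ in range(length):
        if state == 0:
            words.append(random.choice(TOPIC_WORDS[topic]))
        else:
            words.append(random.choice(BACKGROUND_WORDS))
        # Transition to the next HMM state.
        p_content = TRANSITIONS[topic][state][0]
        state = 0 if random.random() < p_content else 1
    return words

print(" ".join(generate_sentence(topic=0)))
```

Setting every topic's HMM to a single always-content state recovers plain LDA's bag-of-words behavior, which is the sense in which the hybrid model subsumes LDA as a special case.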
Author | Title | Journal | DOI | Year
---|---|---|---|---
Jing Jiang | Modeling Syntactic Structures of Topics with a Nested HMM-LDA | ICDM 2009 Proceedings | 10.1109/ICDM.2009.144 | 2009