- NB-BoW: Naive Bayes bag of words, i.e., mixture of unigrams
- pLSA: probabilistic latent semantic analysis
- LDA: latent Dirichlet allocation
- FMM: finite mixture model
- FHMM: finite hierarchical mixture model
- DPMM: Dirichlet process mixture model
- HDP: hierarchical Dirichlet process mixture model
- w -> x denotes generalizing the word w from a categorical variable to a variable x that can be either discrete or continuous
- hierarchy denotes adding a hierarchy to the original model
- K topics denotes extending from one topic per document to multiple topics per document (see the sketch after this list)
- K -> \infty denotes taking the infinite limit of the original model.
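To make the "K topics" extension concrete, here is a minimal generative-process sketch (NumPy only; the vocabulary size, topic count, and function names are made up for this illustration, not taken from any particular paper or library). A mixture of unigrams draws one topic for the whole document, whereas LDA draws a topic per word from a document-specific topic distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
V, K, doc_len = 1000, 10, 50            # vocabulary size, number of topics, words per document
beta = rng.dirichlet(np.ones(V), K)     # K topic-word distributions, shape (K, V)

def mixture_of_unigrams_doc(pi):
    """NB-BoW / mixture of unigrams: one topic z for the whole document."""
    z = rng.choice(K, p=pi)
    return rng.choice(V, size=doc_len, p=beta[z])

def lda_doc(alpha):
    """LDA: each word gets its own topic, drawn from a per-document theta."""
    theta = rng.dirichlet(alpha)                 # document-specific topic proportions
    z = rng.choice(K, size=doc_len, p=theta)     # one topic per word
    return np.array([rng.choice(V, p=beta[zi]) for zi in z])

doc_a = mixture_of_unigrams_doc(pi=np.full(K, 1.0 / K))
doc_b = lda_doc(alpha=np.full(K, 0.1))
```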
Some interesting observations: there are three paths to evolve an NB-BoW model into an HDP-MM model, but no matter which path we choose, we always need to perform all four of the above extensions. The K -> \infty step in particular is sketched below.
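As a rough illustration of the K -> \infty extension (again just a sketch; the function name and parameters are invented for this example), the Chinese restaurant process representation of a Dirichlet process mixture shows how the number of components is no longer fixed in advance but grows with the data:

```python
import numpy as np

def crp_assignments(n_items, concentration=1.0, seed=0):
    """Sample cluster labels from a Chinese restaurant process."""
    rng = np.random.default_rng(seed)
    counts = []        # current cluster sizes ("customers per table")
    labels = []
    for _ in range(n_items):
        # existing clusters are chosen in proportion to their size;
        # a brand-new cluster is chosen in proportion to `concentration`
        weights = np.array(counts + [concentration], dtype=float)
        table = rng.choice(len(weights), p=weights / weights.sum())
        if table == len(counts):
            counts.append(1)               # open a new cluster
        else:
            counts[table] += 1
        labels.append(int(table))
    return labels

labels = crp_assignments(200)
print(len(set(labels)))  # the number of clusters is determined by the data, not by a fixed K
```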