Rethinking LDA: Why Priors Matter, Hanna M. Wallach, David Mimno, Andrew McCallum, NIPS 2009
Abstract:
Implementations of topic models typically use symmetric Dirichlet priors with fixed concentration parameters, with the implicit assumption that such “smoothing parameters” have little practical effect. In this paper, we explore several classes of structured priors for topic models. We find that an asymmetric Dirichlet prior over the document–topic distributions has substantial advantages over a symmetric prior, while an asymmetric prior over the topic–word distributions provides no real benefit. Approximation of this prior structure through simple, efficient hyperparameter optimization steps is sufficient to achieve these performance gains. The prior structure we advocate substantially increases the robustness of topic models to variations in the number of topics and to the highly skewed word frequency distributions common in natural language. Since this prior structure can be implemented using efficient algorithms that add negligible cost beyond standard inference techniques, we recommend it as a new standard for topic modeling.
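The configuration the paper recommends (an asymmetric, optimized Dirichlet prior over document–topic distributions combined with a fixed symmetric prior over topic–word distributions) is exposed directly in off-the-shelf LDA implementations. Below is a minimal sketch, not from the paper, using gensim's LdaModel on a toy corpus of my own invention: alpha="auto" asks gensim to learn an asymmetric document–topic prior via hyperparameter optimization during training, while a scalar eta keeps the topic–word prior symmetric and fixed.

```python
# Sketch only: toy corpus and parameter values are illustrative assumptions,
# not taken from the paper's experiments.
from gensim.corpora import Dictionary
from gensim.models import LdaModel

docs = [
    ["topic", "models", "dirichlet", "prior"],
    ["asymmetric", "prior", "document", "topic"],
    ["symmetric", "prior", "topic", "word"],
]

dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

lda = LdaModel(
    corpus=corpus,
    id2word=dictionary,
    num_topics=2,
    alpha="auto",   # learn an asymmetric document-topic prior from the data
    eta=0.01,       # keep a fixed symmetric topic-word prior
    passes=10,
    random_state=0,
)
print(lda.print_topics())
```

Under the paper's findings, this arrangement should make results more robust to the choice of num_topics than the common default of fixed symmetric priors on both distributions.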
Generative AI to quantify uncertainty in weather forecasting
Posted by Lizao (Larry) Li, Software Engineer, and Rob Carver, Research Scientist, Google Research
Accurate weather forecasts can have a direct impact on ...