
Word Features for Latent Dirichlet Allocation

Part of: Advances in Neural Information Processing Systems 23 (NIPS 2010)




We extend Latent Dirichlet Allocation (LDA) by explicitly allowing for the encoding of side information in the distribution over words. This results in a variety of new capabilities, such as improved estimates for infrequently occurring words, as well as the ability to leverage thesauri and dictionaries in order to boost topic cohesion within and across languages. We present experiments on multi-language topic synchronisation where dictionary information is used to bias corresponding words towards similar topics. Results indicate that our model substantially improves topic cohesion when compared to the standard LDA model.
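As a rough illustration of the mechanism the abstract describes, the sketch below (not the authors' exact formulation; the toy bilingual vocabulary, the feature names, and the log-linear parameterisation are illustrative assumptions) shows how topic-word distributions can be driven by word features, so that dictionary-linked translation pairs are pushed towards similar probabilities within each topic.

import numpy as np

rng = np.random.default_rng(0)

# Toy bilingual vocabulary; translation pairs share a dictionary entry.
vocab = ["dog", "hund", "cat", "katze", "house", "haus"]
dictionary_entry = {"dog": 0, "hund": 0, "cat": 1, "katze": 1, "house": 2, "haus": 2}
n_words, n_features, n_topics = len(vocab), 3, 2

# Feature matrix phi: one binary feature per shared dictionary entry.
phi = np.zeros((n_words, n_features))
for w, word in enumerate(vocab):
    phi[w, dictionary_entry[word]] = 1.0

# Per-topic feature weights lambda_k (random here; in a full model they
# would be estimated from data along with the usual LDA latent variables).
lam = rng.normal(size=(n_topics, n_features))

# Topic-word distributions via a softmax over feature scores:
# beta[k, w] proportional to exp(phi_w . lambda_k).
scores = lam @ phi.T                      # shape (n_topics, n_words)
beta = np.exp(scores)
beta /= beta.sum(axis=1, keepdims=True)

# Because translation pairs share a feature vector, they receive identical
# probability within every topic, biasing them towards the same topics.
for k in range(n_topics):
    print(f"topic {k}:", dict(zip(vocab, beta[k].round(3))))

In this sketch the dictionary enters only through the shared features; words never seen in one language can still inherit sensible topic probabilities from their translations, which is the intuition behind the improved estimates for infrequent words mentioned above.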