TLDR; The authors train an RNN-based topic model that takes word order into account and assumes that words in the same sentence share the same topic. The authors sample topic mixtures from a Dirichlet distribution and then train a "topic embedding" together with the rest of the generative LSTM. The model is evaluated quantitatively using perplexity on generated sentences and on classification tasks, and it clearly beats competing models. Qualitative evaluation shows that the model can generate sensible sentences conditioned on the topic.
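
A minimal sketch of the general idea, not the paper's exact architecture: the class name, dimensions, and the choice of concatenating a learned topic embedding with each word embedding as LSTM input are assumptions for illustration (PyTorch assumed).

```python
# Sketch of a topic-conditioned generative LSTM (illustrative only).
# Assumptions (not from the paper): one shared topic per sentence drawn from a
# Dirichlet topic mixture, and a learned "topic embedding" concatenated with
# each word embedding before the LSTM.
import torch
import torch.nn as nn


class TopicConditionedLSTM(nn.Module):
    def __init__(self, vocab_size, num_topics, word_dim=128, topic_dim=64, hidden_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Topic embedding trained jointly with the rest of the generative LSTM.
        self.topic_emb = nn.Embedding(num_topics, topic_dim)
        self.lstm = nn.LSTM(word_dim + topic_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, word_ids, topic_id):
        # word_ids: (batch, seq_len); topic_id: (batch,) -- one topic per sentence.
        words = self.word_emb(word_ids)                 # (batch, seq_len, word_dim)
        topic = self.topic_emb(topic_id).unsqueeze(1)   # (batch, 1, topic_dim)
        topic = topic.expand(-1, words.size(1), -1)     # repeat the topic at every step
        hidden, _ = self.lstm(torch.cat([words, topic], dim=-1))
        return self.out(hidden)                         # next-word logits


# Usage: sample a topic mixture from a Dirichlet prior, then draw a single
# topic that every word in the sentence shares.
theta = torch.distributions.Dirichlet(torch.ones(10)).sample()  # topic mixture
topic_id = torch.multinomial(theta, 1)                           # shared sentence topic
model = TopicConditionedLSTM(vocab_size=5000, num_topics=10)
logits = model(torch.randint(0, 5000, (1, 7)), topic_id)
```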