L-EnsNMF

Boosted Local Topic Discovery via Ensemble of Nonnegative Matrix Factorization 🏆

Abstract

Nonnegative matrix factorization (NMF) has been widely applied in many domains. In document analysis, it has been increasingly used in topic modeling applications, where a set of underlying topics is revealed by a low-rank factor matrix from NMF. However, the resulting topics often reflect only the general themes in the data and thus convey little specific information. To tackle this problem, we propose a novel ensemble model of nonnegative matrix factorization for discovering high-quality local topics. Our method brings the idea of an ensemble model, which has been successful in supervised learning, into an unsupervised topic modeling context. That is, our model successively performs NMF on a residual matrix obtained from previous stages and generates a sequence of topic sets. Our algorithm for updating the input matrix is novel in two aspects. The first lies in utilizing the residual matrix, inspired by a state-of-the-art gradient boosting model, and the second stems from applying a sophisticated local weighting scheme to the given matrix to enhance the locality of topics, which in turn delivers high-quality, focused topics of interest to users. We evaluate our proposed method against other topic modeling methods, including several variants of NMF and latent Dirichlet allocation, using evaluation measures of topic coherence, diversity, coverage, computing time, and more. We also present a qualitative evaluation of the topics discovered by our method on several real-world data sets.
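To make the stagewise idea concrete, below is a minimal sketch of boosting-style ensemble NMF: each stage factorizes the current residual matrix, and the reconstruction is subtracted (clipped at zero to preserve nonnegativity) before the next stage. This is a hypothetical simplification for illustration only; the function names are made up, and it omits the paper's local anchor-sampling and weighting scheme that produces the *local* topics.

```python
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Rank-k NMF via standard multiplicative updates (Lee & Seung)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Alternate multiplicative updates for H and W.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

def ensemble_nmf(X, n_stages=3, k=2):
    """Successive NMF on nonnegative residuals, in the spirit of
    gradient boosting: each stage explains what earlier stages missed."""
    R = X.copy()
    stages = []
    for _ in range(n_stages):
        W, H = nmf(R, k)
        stages.append((W, H))                # topic set for this stage
        R = np.maximum(R - W @ H, 0.0)       # nonnegative residual
    return stages, R
```

Each `(W, H)` pair yields one stage's topics (columns of `W` over terms, rows of `H` over documents), and the shrinking residual norm indicates how much variation remains unexplained.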


Below are the slides used during the demo.

Want to learn more?

Cite this work

@inproceedings{suh2016ensnmf,
  title={{L-EnsNMF}: Boosted local topic discovery via ensemble of nonnegative matrix factorization},
  author={Suh, Sangho and Choo, Jaegul and Lee, Joonseok and Reddy, Chandan K.},
  booktitle={2016 IEEE 16th International Conference on Data Mining (ICDM)},
  pages={479--488},
  year={2016},
  organization={IEEE}
}