
SparseLSA

Sparse Latent Semantic Indexing

Please cite the following paper if you use the code:

@INPROCEEDINGS{sdm2011slsa,
  author    = {X. Chen and Y. Qi and B. Bai and Q. Lin and J. G. Carbonell},
  title     = {Sparse Latent Semantic Analysis},
  booktitle = {SIAM International Conference on Data Mining (SDM)},
  year      = {2011},
  bib2html_pubtype = {Refereed Conference},
}


Abstract:

Latent semantic analysis (LSA), one of the most popular unsupervised dimension reduction tools, has a wide range of applications in text mining and information retrieval. The key idea of LSA is to learn a projection matrix that maps the high-dimensional vector space representations of documents to a lower-dimensional latent space, the so-called latent topic space. In this paper, we propose a new model called Sparse LSA, which produces a sparse projection matrix via l1 regularization. Compared to traditional LSA, Sparse LSA selects only a small number of relevant words for each topic and hence provides a compact representation of topic-word relationships. Moreover, Sparse LSA is computationally very efficient, with much less memory usage for storing the projection matrix. Furthermore, we propose two important extensions of Sparse LSA: group structured Sparse LSA and non-negative Sparse LSA. We conduct experiments on several benchmark datasets and compare Sparse LSA and its extensions with several widely used methods, e.g., LSA, Sparse Coding, and LDA. Empirical results suggest that Sparse LSA achieves performance gains similar to those of LSA, but is more efficient in projection computation and storage, and better explains the topic-word relationships.
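To make the abstract concrete: one natural reading of the l1-regularized formulation it describes is min over U, A of (1/2)||X - U A||_F^2 + lambda * ||A||_1 subject to U^T U = I, where X is the document-term matrix and A is the sparse topic-word projection matrix. Below is a minimal NumPy sketch of an alternating-minimization solver for that objective. It is illustrative only, not the repository's actual implementation; the function name `sparse_lsa` and all parameter choices are assumptions for the example.

```python
import numpy as np

def sparse_lsa(X, D, lam, n_iters=50, seed=0):
    """Illustrative alternating minimization for
        min_{U,A} 0.5*||X - U A||_F^2 + lam*||A||_1   s.t.  U^T U = I,
    where X is (n_docs x n_words), U is (n_docs x D), and A is the
    sparse (D x n_words) topic-word projection matrix.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    A = 0.01 * rng.standard_normal((D, m))  # small random init
    for _ in range(n_iters):
        # U-step (orthogonal Procrustes): with A fixed, the minimizer of
        # ||X - U A||_F^2 over U with U^T U = I is P @ Q^T, where
        # X A^T = P diag(s) Q^T is a thin SVD.
        P, _, Qt = np.linalg.svd(X @ A.T, full_matrices=False)
        U = P @ Qt
        # A-step: because U^T U = I, the problem separates entrywise and
        # the solution is soft-thresholding of G = U^T X at level lam.
        G = U.T @ X
        A = np.sign(G) * np.maximum(np.abs(G) - lam, 0.0)
    return U, A

# Toy usage: project 100 documents over a 500-word vocabulary onto 10 topics.
X = np.random.default_rng(1).random((100, 500))
U, A = sparse_lsa(X, D=10, lam=0.5)
print("nonzero fraction of A:", np.mean(A != 0))  # sparsity of the projection
```

Larger values of lam drive more entries of A exactly to zero, which is what gives each topic a small set of relevant words and makes the projection matrix cheap to store and apply.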


Paper PDF: http://www.cs.cmu.edu/%7Eqyj/papersA08/SLSA-sdm11.pdf


Talk Slides PDF: http://www.cs.cmu.edu/%7Eqyj/papersA08/11-talk-sdm11-slsa.pdf
