Multi-label classification using stacked hierarchical Dirichlet processes with reduced sampling complexity
In: Wu, Xindong (ed.): 2017 IEEE International Conference on Big Knowledge: 9-10 August 2017, Hefei, China: proceedings. Piscataway, NJ: IEEE, 2017, pp. 1-8
Year of publication: 2017
ISBN/ISSN: 978-1-5386-3120-1; 978-1-5386-3121-8
Publication type: Book contribution (conference paper)
Language: English
Abstract
Nonparametric topic models based on hierarchical Dirichlet processes (HDPs) allow the number of topics to be discovered automatically from the data. The computational complexity of standard Gibbs sampling techniques for model training is linear in the number of topics. Recently, it was reduced to linear in the number of topics per word using a technique called alias sampling combined with Metropolis-Hastings (MH) sampling. We propose a different proposal distribution for the MH step, based on the observation that distributions at the upper hierarchy level change more slowly than the document-specific distributions at the lower level. This reduces the sampling complexity to linear in the number of topics per document, at unchanged test-set log-likelihood. Furthermore, we propose a novel model of stacked HDPs utilizing this sampling method. Experiments demonstrate the effectiveness of the proposed approach in the context of multi-label classification.
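The alias-sampling-plus-MH scheme the abstract refers to can be sketched as follows. This is an illustrative reconstruction, not the paper's code: Walker's alias method gives O(1) draws from a fixed discrete distribution after O(K) setup, and an MH acceptance step corrects for the proposal distribution being stale (rebuilt only occasionally, since the upper-level distribution changes slowly). All function names here are hypothetical.

```python
import random

def build_alias_table(probs):
    """Walker's alias method: O(K) setup for O(1) draws from a fixed
    discrete distribution over K outcomes (e.g. topics)."""
    n = len(probs)
    total = sum(probs)
    scaled = [p * n / total for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] -= 1.0 - scaled[s]       # donate mass to fill bucket s
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                # leftovers are exactly full
        prob[i] = 1.0
    return prob, alias

def alias_draw(prob, alias, rng):
    """O(1) sample: pick a bucket uniformly, then flip its biased coin."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]

def mh_step(current, target, q_probs, q_prob, q_alias, rng):
    """One Metropolis-Hastings step: propose from a (possibly stale)
    distribution q via its alias table, then accept with probability
    min(1, target[t'] q[t] / (target[t] q[t'])), which corrects for
    the mismatch between q and the true conditional `target`."""
    proposal = alias_draw(q_prob, q_alias, rng)
    accept = (target[proposal] * q_probs[current]) / (
        target[current] * q_probs[proposal])
    return proposal if rng.random() < min(1.0, accept) else current
```

In the setting the abstract describes, the proposal q would be derived from the slowly changing upper-level (corpus-wide) topic distribution, so its alias table can be reused across many words before rebuilding; the per-word work then scales with the (much smaller) set of topics active in the document.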
Authors
Classification
DFG subject area:
Computer Science
DDC subject group:
Computer Science