Uncertainty in Artificial Intelligence
Hybrid Variational/Gibbs Collapsed Inference in Topic Models
Max Welling, Yee Whye Teh, Hilbert Kappen
Abstract:
Variational Bayesian inference and (collapsed) Gibbs sampling are two important classes of inference algorithms for Bayesian networks. Both have their advantages and disadvantages: collapsed Gibbs sampling is unbiased but is inefficient for large count values and requires averaging over many samples to reduce variance; variational Bayesian inference is efficient and accurate for large count values but suffers from bias for small counts. We propose a hybrid algorithm that combines the best of both worlds: it samples very small counts and applies variational updates to large counts. This hybridization is shown to significantly improve test-set perplexity relative to variational inference at no additional computational cost.
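The sketch below illustrates the general idea in the setting of LDA, not the authors' exact algorithm: document-word entries with counts above a threshold are given deterministic, CVB0-style variational responsibility updates, while small counts are resampled stochastically from the same collapsed conditional. All names and parameters (hybrid_lda, threshold, alpha, beta, n_iter) are illustrative assumptions, and the joint multinomial draw for a small count is a simplification of a per-token Gibbs sweep.

import numpy as np

def hybrid_lda(counts, K=10, alpha=0.1, beta=0.01, n_iter=50, threshold=5, seed=0):
    """Hybrid collapsed inference sketch for LDA.

    counts: (D, W) array of document-word counts.
    Returns expected document-topic and topic-word count matrices.
    """
    rng = np.random.default_rng(seed)
    D, W = counts.shape
    gamma = {}                       # (d, w) -> length-K topic responsibilities
    n_dk = np.zeros((D, K))          # expected topic counts per document
    n_kw = np.zeros((K, W))          # expected word counts per topic
    n_k = np.zeros(K)                # expected total counts per topic

    # Random initialisation of responsibilities for each nonzero (d, w) entry.
    for d, w in zip(*counts.nonzero()):
        g = rng.dirichlet(np.ones(K))
        gamma[(d, w)] = g
        n_dk[d] += counts[d, w] * g
        n_kw[:, w] += counts[d, w] * g
        n_k += counts[d, w] * g

    for _ in range(n_iter):
        for (d, w), g in gamma.items():
            c = counts[d, w]
            # Remove this entry's current contribution before updating it.
            n_dk[d] -= c * g
            n_kw[:, w] -= c * g
            n_k -= c * g
            # Collapsed conditional over topics (CVB0-style mean-field form).
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + W * beta)
            p /= p.sum()
            if c > threshold:
                new_g = p                       # large count: keep soft variational responsibilities
            else:
                new_g = rng.multinomial(c, p) / c  # small count: sample hard assignments (Gibbs-like)
            gamma[(d, w)] = new_g
            n_dk[d] += c * new_g
            n_kw[:, w] += c * new_g
            n_k += c * new_g
    return n_dk, n_kw

In this sketch the threshold controls the trade-off the abstract describes: setting it to zero recovers a purely deterministic (variational-style) update, while a very large threshold makes every entry stochastic.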
Pages: 587-594
PDF Link: /papers/08/p587-welling.pdf
BibTex:
@INPROCEEDINGS{Welling08,
AUTHOR = "Max Welling and Yee Whye Teh and Hilbert Kappen",
TITLE = "Hybrid Variational/Gibbs Collapsed Inference in Topic Models",
BOOKTITLE = "Proceedings of the Twenty-Fourth Conference Annual Conference on Uncertainty in Artificial Intelligence (UAI-08)",
PUBLISHER = "AUAI Press",
ADDRESS = "Corvallis, Oregon",
YEAR = "2008",
PAGES = "587--594"
}