Uncertainty in Artificial Intelligence
Asymptotically Exact, Embarrassingly Parallel MCMC
Willie Neiswanger, Chong Wang, Eric Xing
Communication costs, resulting from synchronization requirements during learning, can greatly slow down many parallel machine learning algorithms. In this paper, we present a parallel Markov chain Monte Carlo (MCMC) algorithm in which subsets of data are processed independently, with very little communication. First, we arbitrarily partition data onto multiple machines. Then, on each machine, any classical MCMC method (e.g., Gibbs sampling) may be used to draw samples from a posterior distribution given the data subset. Finally, the samples from each machine are combined to form samples from the full posterior. This embarrassingly parallel algorithm allows each machine to act independently on a subset of the data (without communication) until the final combination stage. We prove that our algorithm generates asymptotically exact samples and empirically demonstrate its ability to parallelize burn-in and sampling in several models.
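The three-stage recipe (partition the data, sample each subposterior independently, combine) can be illustrated with a short sketch. The Python below is a minimal illustration, not the paper's implementation: it assumes a conjugate normal model with known noise variance, so each subposterior can be sampled in closed form, and it uses a simple parametric Gaussian-product combination rule rather than the nonparametric density-product estimator that yields the paper's asymptotically exact guarantee. All function names here are hypothetical.

    import numpy as np

    def subposterior_samples(shard, m, n_samples, prior_var=100.0, noise_var=1.0):
        # Subposterior: p(theta)^(1/m) * p(shard | theta), with a N(0, prior_var)
        # prior and N(theta, noise_var) likelihood. Raising the prior to the
        # power 1/m scales its precision by 1/m, so the m subposteriors
        # multiply back to the full posterior.
        n = len(shard)
        prec = (1.0 / m) / prior_var + n / noise_var
        mean = shard.sum() / noise_var / prec
        return np.random.normal(mean, np.sqrt(1.0 / prec), size=n_samples)

    def combine(sub_samples):
        # Parametric combination (assumption, a simpler variant than the
        # paper's nonparametric combiner): fit a Gaussian to each machine's
        # samples and draw from the product of the fitted Gaussians, which
        # has summed precisions and a precision-weighted mean.
        means = np.array([s.mean() for s in sub_samples])
        precs = np.array([1.0 / s.var() for s in sub_samples])
        prec = precs.sum()
        mean = (precs * means).sum() / prec
        return np.random.normal(mean, np.sqrt(1.0 / prec), size=len(sub_samples[0]))

    np.random.seed(0)
    data = np.random.normal(2.0, 1.0, size=10000)    # full dataset
    shards = np.array_split(data, 10)                # arbitrary partition onto 10 "machines"
    subs = [subposterior_samples(s, m=10, n_samples=5000) for s in shards]  # no communication
    theta = combine(subs)                            # single final combination stage
    print("combined posterior mean: %.3f" % theta.mean())

Because the shards never exchange information until the last line, both burn-in and sampling parallelize trivially; only the fitted subposterior samples must be gathered for the combination stage.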
Pages: 623-632
PDF Link: /papers/14/p623-neiswanger.pdf
@INPROCEEDINGS{p623-neiswanger,
  AUTHOR = "Willie Neiswanger and Chong Wang and Eric Xing",
  TITLE = "Asymptotically Exact, Embarrassingly Parallel MCMC",
  BOOKTITLE = "Proceedings of the Thirtieth Conference on Uncertainty in Artificial Intelligence (UAI-14)",
  ADDRESS = "Corvallis, Oregon",
  YEAR = "2014",
  PAGES = "623--632"
}
