Uncertainty in Artificial Intelligence
Inference for Belief Networks Using Coupling From the Past
Michael Harvey, Radford Neal
Abstract:
Inference for belief networks using Gibbs sampling produces a distribution for unobserved variables that differs from the correct distribution by a (usually) unknown error, since convergence to the right distribution occurs only asymptotically. The method of "coupling from the past" samples from exactly the correct distribution by (conceptually) running dependent Gibbs sampling simulations from every possible starting state from a time far enough in the past that all runs reach the same state at time t=0. Explicitly considering every possible state is intractable for large networks, however. We propose a method for layered noisy-or networks that uses a compact, but often imprecise, summary of a set of states. This method samples from exactly the correct distribution, and requires only about twice the time per step of ordinary Gibbs sampling, but it may require more simulation steps than would be needed if chains were tracked exactly.
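To make the coupling-from-the-past idea concrete, here is a minimal sketch of the generic Propp-Wilson scheme for a toy 3-state Markov chain, where tracking every starting state explicitly is still feasible. This is not the paper's layered noisy-or method or its compact state summary; the chain, the update rule `step`, and the doubling schedule are illustrative assumptions only.

```python
# Generic coupling from the past on a tiny chain: run coupled copies from
# every possible starting state, from ever-further back in the past, until
# all copies coalesce by time 0. The coalesced value is an exact sample.
import random

STATES = [0, 1, 2]

def step(state, u):
    # Deterministic update driven by a shared random number u in [0, 1);
    # every chain uses the same u at a given time, which lets them coalesce.
    if u < 0.5:
        return max(state - 1, 0)
    return min(state + 1, 2)

def cftp(seed=0):
    rng = random.Random(seed)
    randomness = []  # shared random numbers; index 0 is the step into time 0
    T = 1
    while True:
        # Fix additional randomness for earlier times (back to time -T).
        while len(randomness) < T:
            randomness.append(rng.random())
        # Run every starting state forward from time -T to time 0,
        # reusing the same random numbers on every attempt.
        current = {s: s for s in STATES}
        for t in range(T, 0, -1):
            u = randomness[t - 1]
            current = {s: step(x, u) for s, x in current.items()}
        values = set(current.values())
        if len(values) == 1:
            return values.pop()  # all chains agree at time 0: exact sample
        T *= 2                   # otherwise restart from further in the past

if __name__ == "__main__":
    print(cftp())
```

The key point the abstract relies on is visible here: the randomness nearest time 0 is fixed once and reused on every attempt, so going further back can only help the chains coalesce, and the sample returned at time 0 has exactly the stationary distribution. The paper's contribution is avoiding the per-state dictionary above by summarizing the set of states compactly for layered noisy-or networks.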
Keywords:
Pages: 256-263
PS Link: ftp://www.cs.toronto.edu/pub/radford/uai-exact.ps.Z
PDF Link: /papers/00/p256-harvey.pdf
BibTeX:
@INPROCEEDINGS{Harvey00,
AUTHOR = "Michael Harvey and Radford Neal",
TITLE = "Inference for Belief Networks Using Coupling From the Past",
BOOKTITLE = "Proceedings of the Sixteenth Conference Annual Conference on Uncertainty in Artificial Intelligence (UAI-00)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "2000",
PAGES = "256--263"
}

