Uncertainty in Artificial Intelligence
Update Rules for Parameter Estimation in Bayesian Networks
Eric Bauer, Daphne Koller, Yoram Singer
Abstract:
This paper re-examines the problem of parameter estimation in Bayesian networks with missing values and hidden variables from the perspective of recent work in on-line learning [Kivinen & Warmuth, 1994]. We provide a unified framework for parameter estimation that encompasses both on-line learning, where the model is continuously adapted to new data cases as they arrive, and the more traditional batch learning, where a pre-accumulated set of samples is used in a one-time model selection process. In the batch case, our framework encompasses both the gradient projection algorithm and the EM algorithm for Bayesian networks. The framework also leads to new on-line and batch parameter update schemes, including a parameterized version of EM. We provide both empirical and theoretical results indicating that parameterized EM allows faster convergence to the maximum likelihood parameters than does standard EM.
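As a rough illustration of the parameterized EM scheme the abstract describes, the sketch below shows one plausible reading of an EM(eta) step for a single conditional probability table (CPT) row. The convex-combination form theta <- (1 - eta) * theta + eta * theta_EM, with eta = 1 recovering standard EM and eta > 1 over-relaxing the step, is an assumption based on the abstract, not the paper's exact formulation; all names are hypothetical.

    import numpy as np

    def em_eta_step(theta, expected_counts, eta=1.0):
        # One parameterized EM step for a single CPT row (hypothetical
        # helper, not taken from the paper). `expected_counts` holds the
        # E-step expected sufficient statistics for this row.
        theta_em = expected_counts / expected_counts.sum()  # standard M-step target
        theta_new = (1.0 - eta) * theta + eta * theta_em    # eta=1 -> ordinary EM
        # For eta > 1 the step can leave the probability simplex,
        # so clip and renormalize.
        theta_new = np.clip(theta_new, 1e-12, None)
        return theta_new / theta_new.sum()

    # With eta=1 this reproduces the ordinary EM re-estimate:
    theta = np.array([0.5, 0.3, 0.2])
    counts = np.array([8.0, 1.0, 1.0])
    print(em_eta_step(theta, counts, eta=1.0))  # -> [0.8 0.1 0.1]
    print(em_eta_step(theta, counts, eta=1.5))  # over-relaxed step

The appeal of such a scheme, per the abstract, is that a larger step size can reach the maximum likelihood parameters in fewer iterations than standard EM.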
Keywords: Learning Bayesian networks, online learning, EM, parameter estimation, hidden variables
Pages: 3-13
PS Link: http://robotics.stanford.edu/~koller/papers/uai97em.ps
PDF Link: /papers/97/p3-bauer.pdf
BibTex:
@INPROCEEDINGS{Bauer97,
AUTHOR = "Eric Bauer and Daphne Koller and Yoram Singer",
TITLE = "Update Rules for Parameter Estimation in Bayesian Networks",
BOOKTITLE = "Proceedings of the Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI-97)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "1997",
PAGES = "3--13"
}

