Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network
David Chickering, David Heckerman
We discuss Bayesian methods for learning Bayesian networks when data sets are incomplete. In particular, we examine asymptotic approximations for the marginal likelihood of incomplete data given a Bayesian network. We consider the Laplace approximation and the less accurate but more efficient BIC/MDL approximation. We also consider approximations proposed by Draper (1993) and Cheeseman and Stutz (1995). These approximations are as efficient as BIC/MDL, but their accuracy has not been studied in any depth. We compare the accuracy of these approximations under the assumption that the Laplace approximation is the most accurate. In experiments using synthetic data generated from discrete naive-Bayes models having a hidden root node, we find that the Cheeseman-Stutz (CS) approximation is the most accurate.
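The two efficient approximations named above have simple closed forms. As a minimal illustrative sketch (not the authors' implementation): BIC/MDL scores a model as log p(D | theta_hat, m) minus a dimensionality penalty, while the Cheeseman-Stutz approximation corrects the closed-form marginal likelihood of the completed data D' by the gap between the incomplete-data and complete-data log likelihoods at the MAP/ML parameters. The function names and argument conventions below are assumptions for illustration only.

```python
from math import log

def bic_score(loglik: float, d: int, n: int) -> float:
    """BIC/MDL approximation to the log marginal likelihood:
        log p(D | m) ~ log p(D | theta_hat, m) - (d / 2) * log N,
    where d is the number of free parameters in model m and N is the
    number of cases in the data set D.
    """
    return loglik - 0.5 * d * log(n)

def cs_score(log_marg_complete: float,
             loglik_incomplete: float,
             loglik_complete: float) -> float:
    """Cheeseman-Stutz approximation (illustrative form):
        log p(D | m) ~ log p(D' | m)
                       + log p(D | theta_hat, m)
                       - log p(D' | theta_hat, m),
    where D' is the data completed with expected sufficient statistics
    (e.g., from an EM run), so that log p(D' | m) has a closed form for
    discrete networks with Dirichlet priors.  Same cost as BIC/MDL once
    theta_hat is available.
    """
    return log_marg_complete + loglik_incomplete - loglik_complete
```

Both scores take only quantities already produced by a single EM run, which is why they share BIC/MDL's efficiency; the CS correction term is what the paper finds to be the more accurate of the two.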
Keywords: Bayesian model averaging, model selection, marginal likelihood,
PS Link: ftp://ftp.research.microsoft.com/pub/Tech-Reports/Winter95-96/TR-9
PDF Link: /papers/96/p158-chickering.pdf
AUTHOR = "David Chickering
and David Heckerman",
TITLE = "Efficient Approximations for the Marginal Likelihood of Incomplete Data Given a Bayesian Network",
BOOKTITLE = "Proceedings of the Twelfth Annual Conference on Uncertainty in Artificial Intelligence (UAI-96)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "1996",
PAGES = "158--168"