Learning Bayesian Networks with Local Structure
Nir Friedman, Moises Goldszmidt
In this paper we examine a novel addition to the known methods for learning Bayesian networks from data, one that improves the quality of the learned networks. Our approach explicitly represents and learns the local structure in the conditional probability tables (CPTs) that quantify these networks. This increases the space of possible models, enabling the representation of CPTs with a variable number of parameters that depends on the learned local structures. The resulting learning procedure can induce models that better emulate the real complexity of the interactions present in the data. We describe the theoretical foundations and practical aspects of learning local structures, as well as an empirical evaluation of the proposed method. This evaluation indicates that the learning curves of the procedure that exploits local structure converge faster than those of the standard procedure. Our results also show that networks learned with local structure tend to be more complex (in terms of arcs), yet require fewer parameters.
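The abstract's central point, that a CPT with local structure can use far fewer parameters than a full table, can be illustrated with a small sketch. The code below is hypothetical (not the authors' implementation) and assumes binary variables and a decision-tree representation of the CPT, one common form of local structure:

```python
# Hypothetical sketch: parameter count of a full CPT vs. a tree-structured CPT
# for a binary child variable with binary parents. Not code from the paper.

def full_cpt_params(num_parents):
    """A full tabular CPT stores one probability per parent configuration,
    so it needs 2**k parameters for k binary parents."""
    return 2 ** num_parents

def tree_cpt_params(tree):
    """A decision-tree CPT: internal nodes test a parent variable; each leaf
    holds a single probability shared by all configurations reaching it.
    Trees are nested tuples (parent_index, subtree_if_0, subtree_if_1),
    with a float at each leaf. The parameter count is the number of leaves."""
    if isinstance(tree, float):
        return 1
    _, left, right = tree
    return tree_cpt_params(left) + tree_cpt_params(right)

# Example: P(Y | X0, X1, X2) where Y depends on X1 only when X0 = 1,
# and never depends on X2.
tree = (0,
        0.9,             # X0 = 0: one shared probability
        (1, 0.2, 0.7))   # X0 = 1: split on X1

print(full_cpt_params(3))    # 8 parameters in the tabular representation
print(tree_cpt_params(tree)) # 3 parameters with local structure
```

This is the sense in which a learned network can add arcs (here Y has three parents) while still requiring fewer parameters overall: context-specific regularities collapse many table rows into a single leaf.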
Keywords: Learning Bayesian networks, MDL scoring metric.
PS Link: ftp://starry.stanford.edu/pub/nir/FrG3.ps
PDF Link: /papers/96/p252-friedman.pdf
@INPROCEEDINGS{FriedmanGoldszmidt96,
  AUTHOR = "Nir Friedman and Moises Goldszmidt",
  TITLE = "Learning Bayesian Networks with Local Structure",
  BOOKTITLE = "Proceedings of the Twelfth Annual Conference on Uncertainty in Artificial Intelligence (UAI-96)",
  PUBLISHER = "Morgan Kaufmann",
  ADDRESS = "San Francisco, CA",
  YEAR = "1996",
  PAGES = "252--262"
}