SparsityBoost: A New Scoring Function for Learning Bayesian Network Structure
Eliot Brenner, David Sontag
Abstract:
We give a new consistent scoring function for structure learning of Bayesian networks. In contrast to traditional approaches to score-based structure learning, such as BDeu or MDL, the complexity penalty that we propose is data-dependent and is given by the probability that a conditional independence test correctly shows that an edge cannot exist. What really distinguishes this new scoring function from earlier work is that it has the property of becoming computationally easier to maximize as the amount of data increases. We prove a polynomial sample complexity result, showing that maximizing this score is guaranteed to correctly learn a structure with no false edges and a distribution close to the generating distribution, whenever there exists a Bayesian network which is a perfect map for the data generating distribution. Although the new score can be used with any search algorithm, we give empirical results showing that it is particularly effective when used together with a linear programming relaxation approach to Bayesian network structure learning.
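The data-dependent penalty described in the abstract is driven by a pairwise conditional independence test. As an illustration only (not the paper's exact formulation), a standard test statistic for independence of two discrete variables is the empirical mutual information, which a minimal sketch might compute as follows; the function name and interface here are hypothetical:

```python
import numpy as np

def empirical_mutual_information(x, y, kx, ky):
    """Empirical mutual information (in nats) between two discrete samples.

    x, y: equal-length sequences of integer labels in [0, kx) and [0, ky).
    A value near zero is evidence of independence (no edge needed);
    a large value is evidence of dependence.
    """
    n = len(x)
    # Joint empirical distribution over (x, y).
    joint = np.zeros((kx, ky))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1.0
    joint /= n
    # Marginals.
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    # MI = sum_ij p(i,j) * log( p(i,j) / (p(i) p(j)) ), skipping empty cells.
    mi = 0.0
    for i in range(kx):
        for j in range(ky):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log(joint[i, j] / (px[i] * py[j]))
    return mi
```

For perfectly correlated binary samples this statistic equals log 2, and for exactly independent samples it is zero; a score in the spirit of the abstract would reward edges whose endpoints show high empirical dependence and penalize edges the test confidently rules out.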
Pages: 112-121
PDF Link: /papers/13/p112brenner.pdf
BibTex:
@INPROCEEDINGS{Brenner13,
AUTHOR = "Eliot Brenner
and David Sontag",
TITLE = "SparsityBoost: A New Scoring Function for Learning Bayesian Network Structure",
BOOKTITLE = "Proceedings of the Twenty-Ninth Conference on Uncertainty in Artificial Intelligence (UAI-13)",
PUBLISHER = "AUAI Press",
ADDRESS = "Corvallis, Oregon",
YEAR = "2013",
PAGES = "112-121"
}

