Exploiting Qualitative Knowledge in the Learning of Conditional Probabilities of Bayesian Networks
Frank Wittig, Anthony Jameson
Algorithms for learning the conditional probabilities of Bayesian networks with hidden variables typically operate within a high-dimensional search space and yield only locally optimal solutions. One way of limiting the search space and avoiding local optima is to impose qualitative constraints that are based on background knowledge concerning the domain. We present a method for integrating formal statements of qualitative constraints into two learning algorithms, APN and EM. In our experiments with synthetic data, this method yielded networks that satisfied the constraints almost perfectly. The accuracy of the learned networks was consistently superior to that of corresponding networks learned without constraints. The exploitation of qualitative constraints therefore appears to be a promising way to increase both the interpretability and the accuracy of learned Bayesian networks with known structure.
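To make the idea of a qualitative constraint concrete, the following is a minimal sketch (not the authors' implementation, whose details are in the paper): it encodes a *positive qualitative influence* of a discrete parent X on a binary child Y, in the sense of Wellman's qualitative probabilistic networks, as monotonicity of P(Y=1 | X=x) in x, and repairs a violating CPT column by a pool-adjacent-violators projection. All function names and the repair strategy here are illustrative assumptions, not the paper's method.

```python
# Sketch of a qualitative (monotonicity) constraint on a CPT, assuming a
# positive influence of parent X on binary child Y. Illustrative only.

def satisfies_positive_influence(cpt):
    """cpt[i] = P(Y=1 | X=i), parent states ordered low -> high.
    A positive influence requires this sequence to be nondecreasing."""
    return all(cpt[i] <= cpt[i + 1] for i in range(len(cpt) - 1))

def project_to_positive_influence(cpt):
    """Pool-adjacent-violators projection: average adjacent entries that
    violate monotonicity, yielding the closest (least-squares)
    nondecreasing sequence. One simple way to re-impose the constraint
    between parameter updates of an iterative learner such as EM."""
    blocks = []  # each block is [sum, count]
    for p in cpt:
        blocks.append([p, 1])
        # merge while the previous block's mean exceeds the last one's
        while (len(blocks) > 1 and
               blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    out = []
    for s, c in blocks:
        out.extend([s / c] * c)
    return out

cpt = [0.2, 0.7, 0.4, 0.9]  # violates the constraint at states 1 and 2
print(satisfies_positive_influence(cpt))                                 # False
print(satisfies_positive_influence(project_to_positive_influence(cpt)))  # True
```

A constrained learner could apply such a projection after each parameter-update step, which is one standard way to keep iterates inside the feasible region; the paper itself describes how the constraints are integrated into the APN and EM updates.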
Keywords: Bayesian network learning, qualitative probabilistic networks, interpretability, back
PS Link: http://w5.cs.uni-sb.de/~fwittig/publications/uai00.wittig.ps
PDF Link: /papers/00/p644-wittig.pdf
AUTHOR = "Frank Wittig
and Anthony Jameson",
TITLE = "Exploiting Qualitative Knowledge in the Learning of Conditional Probabilities of Bayesian Networks",
BOOKTITLE = "Proceedings of the Sixteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI-00)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "2000",
PAGES = "644--652"