Uncertainty in Artificial Intelligence
Inference by Minimizing Size, Divergence, or their Sum
Sebastian Riedel, David Smith, Andrew McCallum
Abstract:
We speed up marginal inference by ignoring factors that do not significantly contribute to overall accuracy. In order to pick a suitable subset of factors to ignore, we propose three schemes: minimizing the number of model factors under a bound on the KL divergence between pruned and full models; minimizing the KL divergence under a bound on factor count; and minimizing the weighted sum of KL divergence and factor count. All three problems are solved using an approximation of the KL divergence that can be calculated in terms of marginals computed on a simple seed graph. Applied to synthetic image denoising and to three different types of NLP parsing models, this technique performs marginal inference up to 11 times faster than loopy BP, with graph sizes reduced by up to 98%, at comparable error in marginals and parsing accuracy. We also show that minimizing the weighted sum of divergence and size is substantially faster than minimizing either of the other objectives based on the approximation to divergence presented here.
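A minimal sketch of how the three selection schemes might look, assuming each factor's contribution to the KL divergence has already been estimated from marginals on a simple seed graph (as the abstract describes). The function names, the list `d` of per-factor divergence estimates, and the per-factor decomposition of the weighted-sum objective are illustrative assumptions, not the authors' implementation:

# Hypothetical sketch of the three factor-pruning schemes from the abstract.
# `d[f]` is an assumed per-factor divergence estimate computed from
# marginals on a seed graph; names here are illustrative only.

def prune_by_divergence_bound(d, epsilon):
    """Scheme 1: minimize factor count subject to the total approximate
    KL of the dropped factors staying below epsilon. Greedily drop the
    least divergent factors first."""
    order = sorted(range(len(d)), key=lambda f: d[f])
    dropped, total = set(), 0.0
    for f in order:
        if total + d[f] > epsilon:
            break
        total += d[f]
        dropped.add(f)
    return [f for f in range(len(d)) if f not in dropped]

def prune_by_size_bound(d, k):
    """Scheme 2: minimize approximate KL subject to keeping at most
    k factors. Keep the k most divergent factors."""
    return sorted(range(len(d)), key=lambda f: d[f], reverse=True)[:k]

def prune_by_weighted_sum(d, lam):
    """Scheme 3: minimize lam * |kept factors| + approximate KL of the
    dropped factors. If the objective decomposes per factor, keeping f
    iff d[f] > lam solves it without sorting or search."""
    return [f for f, df in enumerate(d) if df > lam]

# Example with toy per-factor divergence estimates:
d = [0.001, 0.3, 0.02, 0.0005, 0.7]
print(prune_by_divergence_bound(d, epsilon=0.01))  # [1, 2, 4]
print(prune_by_size_bound(d, k=2))                 # [4, 1]
print(prune_by_weighted_sum(d, lam=0.01))          # [1, 2, 4]

Under this per-factor view, the third scheme reduces to a constant-time test on each factor, which is consistent with the abstract's observation that minimizing the weighted sum is substantially faster than the two bounded variants.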
Pages: 492-499
PDF Link: /papers/10/p492-riedel.pdf
BibTex:
@INPROCEEDINGS{Riedel10,
AUTHOR = "Sebastian Riedel and David Smith and Andrew McCallum",
TITLE = "Inference by Minimizing Size, Divergence, or their Sum",
BOOKTITLE = "Proceedings of the Twenty-Sixth Conference on Uncertainty in Artificial Intelligence (UAI-10)",
PUBLISHER = "AUAI Press",
ADDRESS = "Corvallis, Oregon",
YEAR = "2010",
PAGES = "492--499"
}

