HELM: Highly Efficient Learning of Mixed copula networks
Yaniv Tenzer, Gal Elidan
Abstract:
Learning the structure of probabilistic graphical models for complex real-valued domains is a formidable computational challenge. This inevitably leads to significant modelling compromises such as discretization or the use of a simplistic Gaussian representation. In this work we address the challenge of efficiently learning truly expressive copula-based networks that facilitate a mix of varied copula families within the same model. Our approach is based on a simple but powerful bivariate building block that is used to perform local model selection highly efficiently, thus bypassing much of the computational burden involved in structure learning. We show how this building block can be used to learn general networks and demonstrate its effectiveness on varied and sizeable real-life domains. Importantly, favorable identification and generalization performance come with dramatic runtime improvements. Indeed, the benefits are such that they allow us to tackle domains that are prohibitive when using standard learning approaches.
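
The following is a minimal, hypothetical sketch (in Python, using only NumPy and SciPy) of the kind of bivariate building block the abstract describes: for a single pair of variables, candidate copula families are fit by maximum likelihood on rank-based pseudo-observations and the best-scoring family is kept. The candidate set (Gaussian and Clayton), the parameter bounds, and the selection score are illustrative assumptions, not the authors' HELM procedure; in a full structure learner such a local score would then drive edge and family selection over candidate pairs.

import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

def pseudo_obs(x):
    # Rank-transform a sample to pseudo-observations strictly inside (0, 1).
    return stats.rankdata(x) / (len(x) + 1.0)

def gaussian_copula_loglik(rho, u, v):
    # Log-likelihood of a bivariate Gaussian copula with correlation rho.
    z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)
    det = 1.0 - rho ** 2
    ll = (-0.5 * np.log(det)
          - (rho ** 2 * (z1 ** 2 + z2 ** 2) - 2.0 * rho * z1 * z2) / (2.0 * det))
    return np.sum(ll)

def clayton_copula_loglik(theta, u, v):
    # Log-likelihood of a bivariate Clayton copula with parameter theta > 0.
    s = u ** (-theta) + v ** (-theta) - 1.0
    ll = (np.log1p(theta)
          - (1.0 + theta) * (np.log(u) + np.log(v))
          - (2.0 + 1.0 / theta) * np.log(s))
    return np.sum(ll)

def select_pair_copula(x, y):
    # Fit each candidate family by maximum likelihood and return the winner
    # as (family name, fitted parameter, maximized log-likelihood).
    u, v = pseudo_obs(x), pseudo_obs(y)
    candidates = {
        "gaussian": (lambda p: -gaussian_copula_loglik(p, u, v), (-0.99, 0.99)),
        "clayton": (lambda p: -clayton_copula_loglik(p, u, v), (1e-3, 20.0)),
    }
    best = None
    for name, (neg_ll, bounds) in candidates.items():
        res = minimize_scalar(neg_ll, bounds=bounds, method="bounded")
        score = -res.fun  # a BIC-style penalty could be subtracted here
        if best is None or score > best[2]:
            best = (name, res.x, score)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = rng.standard_normal(500)
    x = z + 0.3 * rng.standard_normal(500)   # correlated toy data
    y = z + 0.3 * rng.standard_normal(500)
    print(select_pair_copula(x, y))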
Keywords:
Pages: 790-799
PS Link:
PDF Link: /papers/14/p790-tenzer.pdf
BibTex:
@INPROCEEDINGS{Tenzer14,
AUTHOR = "Yaniv Tenzer
and Gal Elidan",
TITLE = "HELM: Highly Efficient Learning of Mixed copula networks",
BOOKTITLE = "Proceedings of the Thirtieth Conference on Uncertainty in Artificial Intelligence (UAI-14)",
PUBLISHER = "AUAI Press",
ADDRESS = "Corvallis, Oregon",
YEAR = "2014",
PAGES = "790--799"
}