Sequential Model-Based Ensemble Optimization
Alexandre Lacoste, Hugo Larochelle, Mario Marchand, Francois Laviolette
Abstract:
One of the most tedious tasks in the application of machine learning is model selection, i.e. hyperparameter selection. Fortunately, recent progress has been made in automating this process through the use of sequential model-based optimization (SMBO) methods. These can be used to optimize the cross-validation performance of a learning algorithm over the values of its hyperparameters. However, it is well known that ensembles of learned models almost consistently outperform a single model, even a properly selected one. In this paper, we thus propose an extension of SMBO methods that automatically constructs such ensembles. This method builds on a recently proposed ensemble construction paradigm known as Agnostic Bayesian learning. In experiments on 22 regression and 39 classification data sets, we confirm the success of the proposed approach, which is able to outperform model selection with SMBO.
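
The abstract describes the method at a high level: a hyperparameter search produces a pool of trained candidate models, and an ensemble is then built by weighting candidates according to their estimated probability of being the best, following the Agnostic Bayes idea. The sketch below is only an illustration of that idea, not the authors' implementation: it assumes scikit-learn, substitutes random search for a proper SMBO surrogate, and the bootstrap weighting of per-example validation losses is a hedged reading of the Agnostic Bayes construction.

# Minimal sketch (not the paper's code): hyperparameter search over an SVM,
# followed by an Agnostic-Bayes-style ensemble built from the search history.
# Random search stands in for a full SMBO surrogate; the bootstrap weighting
# below is an illustrative assumption.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) Search loop: each trial yields a fitted model and its per-example validation losses.
history = []
for _ in range(20):
    C, gamma = 10 ** rng.uniform(-2, 2), 10 ** rng.uniform(-4, 0)
    model = SVC(C=C, gamma=gamma, probability=True).fit(X_tr, y_tr)
    losses = (model.predict(X_va) != y_va).astype(float)  # 0/1 loss per validation point
    history.append((model, losses))

# 2) Agnostic-Bayes-style weights: bootstrap the validation losses and count how
#    often each candidate attains the lowest resampled risk.
L = np.stack([losses for _, losses in history])       # shape: (n_candidates, n_val)
wins = np.zeros(len(history))
for _ in range(1000):
    idx = rng.integers(0, L.shape[1], L.shape[1])     # bootstrap sample of validation points
    wins[np.argmin(L[:, idx].mean(axis=1))] += 1
weights = wins / wins.sum()

# 3) Ensemble prediction: weighted average of the candidates' class probabilities.
def ensemble_predict(X_new):
    probs = sum(w * m.predict_proba(X_new) for w, (m, _) in zip(weights, history))
    return probs.argmax(axis=1)

print("ensemble validation accuracy:", (ensemble_predict(X_va) == y_va).mean())

In the paper, the candidate pool comes from the SMBO trajectory itself, so the ensemble is obtained at essentially no extra training cost beyond the hyperparameter search.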
Keywords:
Pages: 440-448
PS Link:
PDF Link: /papers/14/p440-lacoste.pdf
BibTex:
@INPROCEEDINGS{Lacoste14,
  AUTHOR    = "Alexandre Lacoste and Hugo Larochelle and Mario Marchand and Francois Laviolette",
  TITLE     = "Sequential Model-Based Ensemble Optimization",
  BOOKTITLE = "Proceedings of the Thirtieth Conference on Uncertainty in Artificial Intelligence (UAI-14)",
  PUBLISHER = "AUAI Press",
  ADDRESS   = "Corvallis, Oregon",
  YEAR      = "2014",
  PAGES     = "440--448"
}