Uncertainty in Artificial Intelligence
Obtaining Calibrated Probabilities from Boosting
Alexandru Niculescu-Mizil, Richard Caruana
Boosted decision trees typically yield good accuracy, precision, and ROC area. However, because the outputs from boosting are not well calibrated posterior probabilities, boosting yields poor squared error and cross-entropy. We empirically demonstrate why AdaBoost predicts distorted probabilities and examine three calibration methods for correcting this distortion: Platt Scaling, Isotonic Regression, and Logistic Correction. We also experiment with boosting using log-loss instead of the usual exponential loss. Experiments show that Logistic Correction and boosting with log-loss work well when boosting weak models such as decision stumps, but yield poor performance when boosting more complex models such as full decision trees. Platt Scaling and Isotonic Regression, however, significantly improve the probabilities predicted by both boosted stumps and boosted trees.
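The two calibration methods the abstract favors, Platt Scaling (fitting a sigmoid to the classifier's scores) and Isotonic Regression (fitting a monotone step function), can be sketched with scikit-learn's calibration API. This is an illustrative reconstruction, not the paper's own code: the dataset, model sizes, and `CalibratedClassifierCV` usage are assumptions for the sake of a runnable example.

```python
# Hedged sketch: calibrating AdaBoost's probabilities with Platt Scaling
# ("sigmoid") and Isotonic Regression, then comparing squared error
# (Brier score) as the paper does. All settings here are illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier
from sklearn.calibration import CalibratedClassifierCV
from sklearn.metrics import brier_score_loss

# Synthetic binary classification data (stand-in for the paper's datasets).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Uncalibrated AdaBoost baseline.
raw = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Platt Scaling: a sigmoid is fit to held-out scores via cross-validation.
platt = CalibratedClassifierCV(
    AdaBoostClassifier(n_estimators=100, random_state=0),
    method="sigmoid", cv=3,
).fit(X_tr, y_tr)

# Isotonic Regression: a monotone, non-parametric mapping of scores.
iso = CalibratedClassifierCV(
    AdaBoostClassifier(n_estimators=100, random_state=0),
    method="isotonic", cv=3,
).fit(X_tr, y_tr)

for name, model in [("raw", raw), ("platt", platt), ("isotonic", iso)]:
    p = model.predict_proba(X_te)[:, 1]
    print(f"{name:9s} Brier score: {brier_score_loss(y_te, p):.4f}")
```

On most datasets the calibrated models should show a lower Brier score than the raw AdaBoost probabilities, which cluster away from 0 and 1, consistent with the distortion the paper describes.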
Pages: 413-420
PDF Link: /papers/05/p413-niculescu-mizil.pdf
@INPROCEEDINGS{p413-niculescu-mizil,
  AUTHOR = "Alexandru Niculescu-Mizil and Richard Caruana",
  TITLE = "Obtaining Calibrated Probabilities from Boosting",
  BOOKTITLE = "Proceedings of the Twenty-First Annual Conference on Uncertainty in Artificial Intelligence (UAI-05)",
  ADDRESS = "Arlington, Virginia",
  YEAR = "2005",
  PAGES = "413--420"
}
