Uncertainty in Artificial Intelligence
Relative Loss Bounds for On-line Density Estimation with the Exponential Family of Distributions
Katy Azoury, Manfred Warmuth
Abstract:
We consider on-line density estimation with a parameterized density from the exponential family. The on-line algorithm receives one example at a time and maintains a parameter that is essentially an average of the past examples. After receiving an example the algorithm incurs a loss which is the negative log-likelihood of the example w.r.t. the past parameter of the algorithm. An off-line algorithm can choose the best parameter based on all the examples. We prove bounds on the additional total loss of the on-line algorithm over the total loss of the off-line algorithm. These relative loss bounds hold for an arbitrary sequence of examples. The goal is to design algorithms with the best possible relative loss bounds. We use a certain divergence to derive and analyze the algorithms. This divergence is a relative entropy between two exponential distributions.
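To make the on-line protocol concrete, the sketch below plays the game for one simple exponential-family member, a unit-variance Gaussian: at each step the learner predicts with a smoothed average of the past examples, pays the negative log-likelihood of the incoming example under that prediction, and at the end is compared against the best fixed parameter chosen in hindsight. The Gaussian choice, the function name, and the smoothing constants are illustrative assumptions, not the paper's exact algorithm or its bounds.

import numpy as np

def online_gaussian_regret(xs, prior_mean=0.0, prior_count=1.0):
    """Toy on-line density estimation with a unit-variance Gaussian.

    The on-line learner predicts with a smoothed mean of the past
    examples, then pays the negative log-likelihood of the next example
    under that prediction.  The off-line comparator uses the single best
    mean for the whole sequence.  Returns (online_loss, offline_loss,
    regret).  prior_mean / prior_count are illustrative smoothing knobs,
    not parameters taken from the paper.
    """
    def nll(x, mu):
        # negative log-likelihood of x under N(mu, 1)
        return 0.5 * np.log(2 * np.pi) + 0.5 * (x - mu) ** 2

    xs = np.asarray(xs, dtype=float)
    online_loss = 0.0
    running_sum, count = prior_mean * prior_count, prior_count
    for x in xs:
        mu_t = running_sum / count          # parameter built from past examples only
        online_loss += nll(x, mu_t)         # incur loss before updating on x
        running_sum += x                    # then fold x into the running average
        count += 1

    mu_star = xs.mean()                     # best fixed parameter in hindsight
    offline_loss = nll(xs, mu_star).sum()
    return online_loss, offline_loss, online_loss - offline_loss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.0, size=200)
    on, off, regret = online_gaussian_regret(data)
    print(f"on-line loss {on:.2f}, off-line loss {off:.2f}, regret {regret:.2f}")

The relative loss bounds in the paper control the last quantity (the regret) for an arbitrary example sequence; the script above merely illustrates the quantities being compared on synthetic data.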
Keywords:
Pages: 31-40
PDF Link: /papers/99/p31-azoury.pdf
BibTeX:
@INPROCEEDINGS{Azoury99,
AUTHOR = "Katy Azoury and Manfred Warmuth",
TITLE = "Relative Loss Bounds for On-line Density Estimation with the Exponential Family of Distributions",
BOOKTITLE = "Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence (UAI-99)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "1999",
PAGES = "31--40"
}

