Relative Loss Bounds for Online Density Estimation with the Exponential Family of Distributions
Katy Azoury, Manfred Warmuth
Abstract:
We consider online density estimation with a parameterized density from the exponential family. The online algorithm receives one example at a time and maintains a parameter that is essentially an average of the past examples. After receiving an example, the algorithm incurs a loss, which is the negative log-likelihood of the example w.r.t. the past parameter of the algorithm. An offline algorithm can choose the best parameter based on all the examples. We prove bounds on the additional total loss of the online algorithm over the total loss of the offline algorithm. These relative loss bounds hold for an arbitrary sequence of examples. The goal is to design algorithms with the best possible relative loss bounds. We use a certain divergence to derive and analyze the algorithms. This divergence is a relative entropy between two exponential distributions.
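As an illustration of the setting (not the paper's analysis), the following sketch instantiates the protocol for a unit-variance Gaussian, a member of the exponential family whose maintained parameter is exactly the running mean of the past examples. The example sequence and the initial parameter mu = 0 are arbitrary choices for demonstration; the quantity `regret` is the "additional total loss" that the paper's bounds control.

```python
import math

def nll(x, mu):
    # Negative log-likelihood of x under a unit-variance Gaussian N(mu, 1).
    return 0.5 * math.log(2 * math.pi) + 0.5 * (x - mu) ** 2

xs = [0.3, -1.2, 0.8, 2.1, 0.0]  # an arbitrary example sequence

# Online protocol: predict each example with the average of the past examples
# (mu = 0 before any example is seen; an illustrative choice of start).
online_loss = 0.0
mu, n = 0.0, 0
for x in xs:
    online_loss += nll(x, mu)  # loss w.r.t. the past parameter
    n += 1
    mu += (x - mu) / n         # incremental average of the examples so far

# Offline comparator: the best single parameter in hindsight is the
# overall mean, which minimizes the total negative log-likelihood.
best_mu = sum(xs) / len(xs)
offline_loss = sum(nll(x, best_mu) for x in xs)

# Additional total loss of the online algorithm over the offline one.
regret = online_loss - offline_loss
print(online_loss, offline_loss, regret)
```

For this sequence the online total loss exceeds the offline total loss; the paper proves bounds on this gap that hold for arbitrary sequences.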
Keywords:
Pages: 31-40
PS Link:
PDF Link: /papers/99/p31azoury.pdf
BibTex:
@INPROCEEDINGS{Azoury99,
AUTHOR = "Katy Azoury
and Manfred Warmuth",
TITLE = "Relative Loss Bounds for Online Density Estimation with the Exponential Family of Distributions",
BOOKTITLE = "Proceedings of the Fifteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI99)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "1999",
PAGES = "31--40"
}

