Uncertainty in Artificial Intelligence
Robust Feature Selection by Mutual Information Distributions
Marco Zaffalon, Marcus Hutter
Abstract:
Mutual information is widely used in artificial intelligence, in a descriptive way, to measure the stochastic dependence of discrete random variables. To address questions such as the reliability of the empirical value, one must take a sample-to-population inferential approach. This paper deals with the distribution of mutual information, as obtained in a Bayesian framework by means of a second-order Dirichlet prior distribution. The exact analytical expression for the mean and an analytical approximation of the variance are reported, and asymptotic approximations of the distribution are proposed. The results are applied to the problem of selecting features for incremental learning and classification with the naive Bayes classifier. A fast, newly defined method is shown to outperform the traditional approach based on empirical mutual information on a number of real data sets. Finally, a theoretical development is reported that allows the above methods to be extended to incomplete samples in an efficient and effective way.
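The abstract does not reproduce the formulas themselves. As a rough illustration of the quantities involved, the sketch below computes the plug-in (empirical) mutual information of a two-way contingency table and the posterior mean of mutual information under a Dirichlet posterior over the joint chances, using the digamma-based exact-mean expression from the authors' related derivations. It is a minimal sketch, not the paper's implementation: the function names, the uniform prior choice, and the use of nats are assumptions of this example; consult the linked PDF for the authoritative expressions.

import numpy as np
from scipy.special import digamma

def mi_posterior_mean(counts, prior=1.0):
    """Posterior mean of I(X;Y) under a Dirichlet posterior over the joint
    cell probabilities (digamma form; assumed here, see the paper).

    counts : 2-D array of observed cell counts.
    prior  : Dirichlet hyperparameter added to every cell (assumption: uniform).
    """
    n_ij = np.asarray(counts, dtype=float) + prior       # posterior counts
    n = n_ij.sum()
    n_i = n_ij.sum(axis=1, keepdims=True)                # row totals
    n_j = n_ij.sum(axis=0, keepdims=True)                # column totals
    # E[I] = sum_ij (n_ij/n) * (psi(n_ij+1) - psi(n_i+1) - psi(n_j+1) + psi(n+1))
    return float(np.sum(n_ij / n * (digamma(n_ij + 1) - digamma(n_i + 1)
                                    - digamma(n_j + 1) + digamma(n + 1))))

def mi_empirical(counts):
    """Plug-in mutual information (in nats) for comparison with the mean."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    pi = p.sum(axis=1, keepdims=True)
    pj = p.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p / (pi * pj)), 0.0)
    return float(terms.sum())

# Hypothetical usage: compare the empirical value with the posterior mean
# for a small feature-class contingency table.
counts = np.array([[30, 10], [5, 25]])
print(mi_empirical(counts), mi_posterior_mean(counts))

For small samples the posterior mean (together with the variance approximation reported in the paper) indicates how far the empirical value can be trusted, which is the basis of the robust feature-selection filter described in the abstract.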
Keywords: Mutual information, cross entropy, Dirichlet distribution, second-order distribution
Pages: 577-584
PS Link: http://www.idsia.ch/~marcus/ai/feature.ps
PDF Link: /papers/02/p577-zaffalon.pdf
BibTex:
@INPROCEEDINGS{Zaffalon02,
AUTHOR = "Marco Zaffalon and Marcus Hutter",
TITLE = "Robust Feature Selection by Mutual Information Distributions",
BOOKTITLE = "Proceedings of the Eighteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI-02)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "2002",
PAGES = "577--584"
}

