Uncertainty in Artificial Intelligence
Plausibility Measures: A User's Guide
Nir Friedman, Joseph Halpern
Abstract:
We examine a new approach to modeling uncertainty based on plausibility measures, where a plausibility measure just associates with an event its plausibility, an element of some partially ordered set. This approach is easily seen to generalize other approaches to modeling uncertainty, such as probability measures, belief functions, and possibility measures. The lack of structure in a plausibility measure makes it easy for us to add structure on an "as needed" basis, letting us examine what is required to ensure that a plausibility measure has certain properties of interest. This gives us insight into the essential features of the properties in question, while allowing us to prove general results that apply to many approaches to reasoning about uncertainty. Plausibility measures have already proved useful in analyzing default reasoning. In this paper, we examine their "algebraic properties," analogues of the use of + and * in probability theory. An understanding of such properties will be essential if plausibility measures are to be used in practice as a representation tool.
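For orientation, the basic notion the abstract refers to can be sketched as follows (a paraphrase of the standard formulation, not a quotation from the paper): a plausibility measure is a map

$$\mathrm{Pl} : \mathcal{F} \to D$$

from a set of events $\mathcal{F}$ over a set of worlds $W$ into a partially ordered set $D$ with a bottom element $\bot$ and a top element $\top$, satisfying

$$\mathrm{Pl}(\emptyset) = \bot, \qquad \mathrm{Pl}(W) = \top, \qquad A \subseteq B \;\Rightarrow\; \mathrm{Pl}(A) \le \mathrm{Pl}(B).$$

Probability is the special case where $D = [0,1]$ with the usual order; belief functions and possibility measures also map into $[0,1]$ but obey different combination rules, which is why only the ordering is assumed here.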
Keywords: Foundations of uncertainty concepts: uncertainty measures, independence, default reasoning
Pages: 175-184
PS Link: http://www.cs.cornell.edu/home/halpern/papers/plausibility_manual.ps
PDF Link: /papers/95/p175-friedman.pdf
BibTex:
@INPROCEEDINGS{Friedman95,
AUTHOR = "Nir Friedman and Joseph Halpern",
TITLE = "Plausibility Measures: A User's Guide",
BOOKTITLE = "Proceedings of the Eleventh Annual Conference on Uncertainty in Artificial Intelligence (UAI-95)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "1995",
PAGES = "175--184"
}

