Uncertainty in Artificial Intelligence
Inductive Inference and the Representation of Uncertainty
Norman Dalkey
Abstract:
The form and justification of inductive inference rules depend strongly on the representation of uncertainty. This paper examines one generic representation, namely, incomplete information. The notion can be formalized by presuming that the relevant probabilities in a decision problem are known only to the extent that they belong to a class K of probability distributions. The concept is a generalization of a frequent suggestion that uncertainty be represented by intervals or ranges on probabilities. To make the representation useful for decision making, an inductive rule can be formulated which determines, in a well-defined manner, a best approximation to the unknown probability, given the set K. In addition, the knowledge-set notion entails a natural procedure for updating: modifying the set K given new evidence. Several non-intuitive consequences of updating emphasize the differences between inference with complete and inference with incomplete information.
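The abstract's central objects, a knowledge set K of candidate distributions, probability intervals derived from K, and updating of K by evidence, can be sketched in a few lines of code. The sketch below is purely illustrative and is not taken from the paper: K is a small finite set of toy distributions, updating conditions each member on the observed event, and the "best approximation" is shown here as the maximum-entropy member of K, which is an assumption standing in for the paper's unspecified inductive rule.

```python
from math import log

# A toy "knowledge set" K: each member is a probability distribution
# over three outcomes. (Illustrative numbers, not from the paper.)
K = [
    {"a": 0.5, "b": 0.3, "c": 0.2},
    {"a": 0.4, "b": 0.4, "c": 0.2},
    {"a": 0.6, "b": 0.2, "c": 0.2},
]

def prob_interval(K, event):
    """Lower and upper probability of an event (a set of outcomes),
    ranging over the members of K -- the interval view of uncertainty."""
    vals = [sum(p[x] for x in event) for p in K]
    return min(vals), max(vals)

def update(K, event):
    """Update K on observed evidence: condition every member on the
    event, discarding members that assign it probability zero."""
    new_K = []
    for p in K:
        pe = sum(p[x] for x in event)
        if pe > 0:
            new_K.append({x: (p[x] / pe if x in event else 0.0)
                          for x in p})
    return new_K

def max_entropy_member(K):
    """One illustrative 'best approximation' to the unknown probability:
    the maximum-entropy member of K (an assumed rule, not the paper's)."""
    def entropy(p):
        return -sum(v * log(v) for v in p.values() if v > 0)
    return max(K, key=entropy)
```

For example, `prob_interval(K, {"a"})` gives the interval (0.4, 0.6) before updating, and after `K2 = update(K, {"a", "b"})` the interval for `{"a"}` tightens to (0.5, 0.75), illustrating how evidence reshapes the knowledge set rather than a single distribution.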
Keywords: Inductive Inference Rules, Representation of Uncertainty
Pages: 393-397
PDF Link: /papers/85/p393-dalkey.pdf
BibTeX:
@INPROCEEDINGS{Dalkey85,
  AUTHOR = "Norman Dalkey",
  TITLE = "Inductive Inference and the Representation of Uncertainty",
  BOOKTITLE = "Proceedings of the Annual Conference on Uncertainty in Artificial Intelligence (UAI-85)",
  PUBLISHER = "Elsevier Science",
  ADDRESS = "Amsterdam, NL",
  YEAR = "1985",
  PAGES = "393--397"
}