Integrating Probabilistic Rules into Neural Networks: A Stochastic EM Learning Algorithm
The EM algorithm is a general procedure for obtaining maximum likelihood estimates when some of the observations on the variables of a network are missing. In this paper a stochastic version of the algorithm is adapted to probabilistic neural networks describing the associative dependency of variables. These networks have a probability distribution that is a special case of the distribution generated by probabilistic inference networks. Hence both types of networks can be combined, allowing probabilistic rules as well as unspecified associations to be integrated in a sound way. The resulting network may have a number of interesting features, including cycles of probabilistic rules, hidden 'unobservable' variables, and uncertain and contradictory evidence.
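The stochastic EM variant referred to in the abstract replaces the classical E-step's expectation over the missing variables with a single random completion drawn from their posterior, followed by an ordinary M-step. The following is a minimal illustrative sketch of that idea on a toy model (a two-component Gaussian mixture with unit variances, where the component labels act as the missing variables); it is not the paper's network model, and all names and the model choice are assumptions for illustration.

```python
import math
import random

def stochastic_em(data, n_iter=200, seed=0):
    """Stochastic EM for a two-component 1D Gaussian mixture with
    unit variances. The component labels are the 'missing' variables:
    instead of averaging over them (classical E-step), each iteration
    samples one completion (stochastic E-step), then maximizes the
    complete-data likelihood given that completion (M-step)."""
    rng = random.Random(seed)
    mu = [min(data), max(data)]   # crude initialisation of the two means
    pi = 0.5                      # mixing weight of component 0
    for _ in range(n_iter):
        # Stochastic E-step: sample a label z_i for every data point
        # from its posterior p(z_i | x_i, current parameters).
        labels = []
        for x in data:
            p0 = pi * math.exp(-0.5 * (x - mu[0]) ** 2)
            p1 = (1 - pi) * math.exp(-0.5 * (x - mu[1]) ** 2)
            labels.append(0 if rng.random() < p0 / (p0 + p1) else 1)
        # M-step: maximum-likelihood update given the sampled completion.
        for k in (0, 1):
            members = [x for x, z in zip(data, labels) if z == k]
            if members:
                mu[k] = sum(members) / len(members)
        pi = labels.count(0) / len(labels)
    return mu, pi
```

Because each iteration uses a sampled completion rather than an exact expectation, the parameter sequence fluctuates around the maximum likelihood estimate instead of converging to it deterministically, which is what makes the stochastic version tractable for models where the exact E-step is infeasible.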
PDF Link: /papers/91/p264-paass.pdf
AUTHOR = "Gerhard Paaß",
TITLE = "Integrating Probabilistic Rules into Neural Networks: A Stochastic EM Learning Algorithm",
BOOKTITLE = "Proceedings of the Seventh Annual Conference on Uncertainty in Artificial Intelligence (UAI-91)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Mateo, CA",
YEAR = "1991",
PAGES = "264--270"