Uncertainty in Artificial Intelligence
PEGASUS: A Policy Search Method for Large MDPs and POMDPs
Andrew Ng, Michael Jordan
Abstract:
We propose a new approach to the problem of searching a space of policies for a Markov decision process (MDP) or a partially observable Markov decision process (POMDP), given a model. Our approach is based on the following observation: Any (PO)MDP can be transformed into an "equivalent" POMDP in which all state transitions (given the current state and action) are deterministic. This reduces the general problem of policy search to one in which we need only consider POMDPs with deterministic transitions. We give a natural way of estimating the value of all policies in these transformed POMDPs. Policy search is then simply performed by searching for a policy with high estimated value. We also establish conditions under which our value estimates will be good, recovering theoretical results similar to those of Kearns, Mansour and Ng (1999), but with "sample complexity" bounds that have only a polynomial rather than exponential dependence on the horizon time. Our method applies to arbitrary POMDPs, including ones with infinite state and action spaces. We also present empirical results for our approach on a small discrete problem, and on a complex continuous state/continuous action problem involving learning to ride a bicycle.
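The central trick described in the abstract — drawing all of the randomness in advance so that each policy's estimated value becomes a deterministic function of the policy — can be sketched in a few lines. The sketch below is an illustration, not the authors' code: the transition function `step`, the toy chain MDP, and the parameter choices (m scenarios, horizon H, discount gamma) are all hypothetical stand-ins.

```python
import numpy as np

def pegasus_value(policy, step, s0, m=50, H=100, gamma=0.99, base_seed=0):
    """Deterministic Monte Carlo value estimate over m fixed 'scenarios'.

    Fixing each scenario's random seed up front makes the transitions
    deterministic given (state, action, noise), so the same scenarios
    can be reused for every policy and the estimate is a deterministic
    function of the policy -- policy search becomes ordinary
    deterministic optimization of this function.
    """
    returns = []
    for i in range(m):
        rng = np.random.default_rng(base_seed + i)  # scenario i's fixed randomness
        s, g, ret = s0, 1.0, 0.0
        for _ in range(H):
            a = policy(s)
            s, r = step(s, a, rng.random())  # transition driven by pre-drawn noise
            ret += g * r
            g *= gamma
        returns.append(ret)
    return float(np.mean(returns))

# Hypothetical toy MDP: a noisy 1-D chain where the intended move
# succeeds with probability 0.8 and reward penalizes distance from 0.
def step(s, a, u):
    s2 = s + a if u < 0.8 else s - a
    return s2, -abs(s2)

policy = lambda s: 1 if s <= 0 else -1  # move toward the origin
print(pegasus_value(policy, step, s0=3))
```

Because the scenarios are held fixed, evaluating the same policy twice returns the same number, which is what permits the paper's polynomial (rather than exponential) sample-complexity bounds in the horizon time.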
Keywords: Policy search, MDPs, POMDPs, reinforcement learning
Pages: 406-415
PS Link: http://www.cs.berkeley.edu/~ang/papers/uai00-pegasus.ps
PDF Link: /papers/00/p406-ng.pdf
BibTex:
@INPROCEEDINGS{Ng00,
AUTHOR = "Andrew Ng and Michael Jordan",
TITLE = "PEGASUS: A Policy Search Method for Large MDPs and POMDPs",
BOOKTITLE = "Proceedings of the Sixteenth Conference Annual Conference on Uncertainty in Artificial Intelligence (UAI-00)",
PUBLISHER = "Morgan Kaufmann",
ADDRESS = "San Francisco, CA",
YEAR = "2000",
PAGES = "406--415"
}