Probabilistic Decision-Making Under Model Uncertainty

Abstract

Partially Observable Markov Decision Processes (POMDPs) offer a rich mathematical framework for decision-making under uncertainty. In recent years, a number of methods have been developed to optimize the choice of actions given a parametric model of the domain. In many applications, however, this model must be learned from a finite set of trajectories. When such data is difficult or expensive to collect, the resulting model is often poorly or imprecisely specified.

In this talk, I will present two recent results on decision-making under model uncertainty. In the first half, I will describe a method for estimating the bias and variance of the value function in terms of the statistics of the empirical transition and observation models. Such error terms can be used to meaningfully compare the values of different policies. In the second half, I will present a Bayesian approach designed to simultaneously improve the model and select good actions. The performance of both methods will be illustrated on problems drawn from robotics and medical treatment design.
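The abstract does not specify either algorithm, but the underlying idea of propagating model uncertainty into value estimates can be illustrated with a small sketch. The Python code below is a hypothetical illustration, not the speaker's method: it assumes a tiny fully observable MDP for simplicity (the talk addresses the harder POMDP case), maintains a Dirichlet posterior over the transition model, and recomputes a candidate policy's value under models sampled from that posterior. The mean and spread of the sampled values play the role of the bias and variance terms used to compare policies. All names, sizes, and rewards here are invented for the example.

    import numpy as np

    rng = np.random.default_rng(0)
    n_states, n_actions, gamma = 4, 2, 0.95

    # Dirichlet posterior over transition probabilities: one pseudo-count
    # vector per (state, action) pair, starting from a uniform prior.
    counts = np.ones((n_states, n_actions, n_states))

    def update(s, a, s_next):
        """Bayesian update: an observed transition increments its count."""
        counts[s, a, s_next] += 1

    def policy_value(T, R, policy):
        """Exact value of a deterministic policy under model T, via the
        linear Bellman equations (I - gamma * T_pi) V = R_pi."""
        T_pi = T[np.arange(n_states), policy]   # (S, S) rows chosen by policy
        R_pi = R[np.arange(n_states), policy]   # (S,)
        return np.linalg.solve(np.eye(n_states) - gamma * T_pi, R_pi)

    def value_uncertainty(R, policy, n_samples=500):
        """Sample transition models from the Dirichlet posterior and
        recompute the policy's value under each, giving a mean value and
        a per-state spread that reflects model uncertainty."""
        values = np.empty((n_samples, n_states))
        for i in range(n_samples):
            # Dirichlet sampling via normalized Gamma draws, which
            # broadcasts cleanly over all (state, action) rows at once.
            g = rng.gamma(counts)
            T = g / g.sum(axis=-1, keepdims=True)
            values[i] = policy_value(T, R, policy)
        return values.mean(axis=0), values.std(axis=0)

    # Illustrative reward table and simulated experience (all invented).
    R = rng.uniform(0, 1, size=(n_states, n_actions))
    for _ in range(200):
        s, a = rng.integers(n_states), rng.integers(n_actions)
        update(s, a, rng.integers(n_states))

    # Compare two candidate policies with uncertainty attached.
    for pi in (np.zeros(n_states, dtype=int), np.ones(n_states, dtype=int)):
        mean, std = value_uncertainty(R, pi)
        print(pi, mean.round(2), std.round(2))

In the POMDP setting the talk considers, the same posterior-sampling idea extends to Dirichlet posteriors over the observation model as well, with planning carried out over belief states rather than states.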

Bio

Venue, Date, and Time

Venue: Newell Simon Hall 1507

Date: Monday, Oct 20, 2008

Time: 12:00 noon

Slides