- Class schedule: Tuesday, April 22, 2014, from 2:00 pm to 3:00 pm
- Location: Building 1, Room 4214
- Refreshments: Available at 1:45 pm
Abstract
Bayesian inference provides a natural framework for quantifying uncertainty in parameter estimates and model predictions, for combining heterogeneous sources of information, and for conditioning sequential predictions on data. Posterior simulation in Bayesian inference often proceeds via Markov chain Monte Carlo (MCMC) or sequential Monte Carlo (SMC), but the associated computational expense and convergence issues can present significant bottlenecks in large-scale or dynamically complex problems.
We present a new approach to Bayesian inference that entirely avoids MCMC or SMC simulation by constructing a deterministic map that pushes forward the prior measure (or another reference measure) to the posterior measure. Existence and uniqueness of a suitable measure-preserving map are established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through the solution of a stochastic optimization problem; in particular, we use a sample average approximation approach that exploits gradient information from the likelihood function when available. Advantages of the resulting scheme include analytical expressions for posterior moments, clear convergence criteria for posterior approximation, the ability to generate arbitrary numbers of independent samples, and automatic evaluation of the marginal likelihood to facilitate model comparison.
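For readers who want a concrete picture of the core idea, below is a minimal, illustrative sketch (not the speaker's implementation): parameterize a monotone map from a standard-normal reference and minimize a sample-average approximation of the KL divergence between the pushed-forward reference and an unnormalized posterior. The affine map T(x) = a + exp(b)·x, the toy conjugate-Gaussian model, and all variable names below are assumptions made for illustration; the approach described in the talk uses much richer map parameterizations.

```python
# Minimal sketch (assumptions noted above): fit a transport map from a
# standard-normal reference to a 1D Gaussian posterior by minimizing a
# sample-average approximation of KL(T_# reference || posterior).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy conjugate model (assumed): theta ~ N(0, 1), y | theta ~ N(theta, sigma^2)
sigma, y_obs = 0.5, 1.3

def log_posterior_unnorm(theta):
    # log prior + log likelihood, up to an additive constant
    return -0.5 * theta**2 - 0.5 * ((y_obs - theta) / sigma) ** 2

# Fixed reference samples make the objective deterministic (sample average approximation)
x_ref = rng.standard_normal(2000)

def saa_objective(params):
    a, b = params
    t = a + np.exp(b) * x_ref           # pushforward samples T(x_i)
    log_det = b                          # log |dT/dx| = b for this affine map
    # Monte Carlo estimate of KL(T_# reference || posterior), up to a constant
    return -np.mean(log_posterior_unnorm(t) + log_det)

res = minimize(saa_objective, x0=np.zeros(2), method="BFGS")
a_opt, b_opt = res.x

# Independent posterior samples are now just the map applied to fresh reference draws
samples = a_opt + np.exp(b_opt) * rng.standard_normal(10000)

# Closed-form Gaussian posterior for comparison
post_var = 1.0 / (1.0 + 1.0 / sigma**2)
post_mean = post_var * y_obs / sigma**2
print(f"map mean {samples.mean():.3f} vs exact {post_mean:.3f}")
print(f"map std  {samples.std():.3f} vs exact {np.sqrt(post_var):.3f}")
```

Because the target here is Gaussian, an affine map already pushes the reference exactly onto the posterior, so the optimized map recovers the exact mean and standard deviation; generating additional posterior samples then costs only a map evaluation.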
In the second part of the talk, we will present two new schemes for nonlinear filtering using transport maps. The first is a two-stage approach that uses one transport map to push the forecast distribution to a reference distribution and another map to perform the Bayesian update; we show that this scheme can be viewed as a generalization of the ensemble Kalman filter that converges to the Bayesian posterior. The second is a single-stage approach that maps a higher-dimensional reference measure to the joint distribution of the states between one assimilation step and the next, effectively performing smoothing over a limited time interval. Numerical examples show good filtering performance and convergence to the true Bayesian posterior in canonical chaotic dynamical systems (e.g., Lorenz-96) and in the filtering of rare events.
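As a point of reference for the two-stage filtering scheme, the sketch below shows the affine special case it generalizes: a perturbed-observation ensemble Kalman filter analysis step, written as an affine map applied to each forecast ensemble member. The linear observation operator H, the noise covariance R, and the toy two-state example are assumptions made for illustration only.

```python
# Minimal sketch (illustrative only): stochastic EnKF analysis step viewed as
# an affine map x -> x + K (y + eps - H x) applied to each forecast member.
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(X_f, y_obs, H, R):
    """Update a forecast ensemble X_f (n_members x n_states) with observation y_obs."""
    n, _ = X_f.shape
    X_c = X_f - X_f.mean(axis=0)                      # centered ensemble
    P = X_c.T @ X_c / (n - 1)                          # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    eps = rng.multivariate_normal(np.zeros(len(R)), R, size=n)
    innov = y_obs + eps - X_f @ H.T                    # perturbed innovations
    return X_f + innov @ K.T                           # affine update of each member

# Toy 2-state example with one observed component (assumed setup)
X_f = rng.multivariate_normal([0.0, 0.0], [[2.0, 0.5], [0.5, 1.0]], size=500)
H = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
X_a = enkf_analysis(X_f, y_obs=np.array([1.0]), H=H, R=R)
print("analysis ensemble mean:", X_a.mean(axis=0))
```

The two-stage scheme described in the talk replaces this affine update with nonlinear transport maps, which is what allows convergence to the true Bayesian posterior in non-Gaussian settings.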
Short Bio
Youssef Marzouk is the Class of 1942 Associate Professor in the Department of Aeronautics and Astronautics at MIT, and Director of MIT's Aerospace Computational Design Laboratory. His research focuses on computational methodology for uncertainty quantification and statistical inference in complex physical systems, and on using these tools to address modeling challenges in energy conversion and environmental applications. He received his SB (1997), SM (1999), and PhD (2004) degrees in mechanical engineering from MIT, and spent several years at Sandia National Laboratories before joining the MIT faculty in 2009. He is a recipient of the Hertz Foundation Doctoral Thesis Prize (2004), the US Department of Energy Early Career Research Award (2010), and the Junior Bose Award for Teaching Excellence from the MIT School of Engineering (2012).