Event

Aaron Smith (University of Ottawa)

Friday, November 29, 2024, 15:30 to 16:30
Burnside Hall, 805 rue Sherbrooke Ouest, Montreal, QC, H3A 0B9, CA

TITLE / TITRE

Free lunches and subsampling Monte Carlo

ABSTRACT / RÉSUMÉ

It is well known that the performance of MCMC algorithms degrades quickly when targeting computationally expensive posterior distributions, including the posteriors of even simple models when the dataset is large. This has motivated the search for MCMC variants that scale well to large datasets. One simple approach, taken by several research groups, is to look at only a subsample of the data at every step. This method is known to work quite well for optimization, and variants of stochastic gradient descent are the workhorse of modern machine learning. In this talk, we focus on a simple "no-free-lunch" result showing that no algorithm of this sort can provide substantial speedups for Bayesian computation. We briefly sketch the main steps of the proof, illustrate how these generic results apply to realistic statistical problems and proposed algorithms, and discuss some special examples that escape our generic results and provide a free (or at least cheap) lunch. We also mention recent work "in both directions," extending our basic conclusion to some non-reversible chains and showing explicitly how it can be avoided for more complex posteriors. (Based on joint work with Patrick Conrad, Andrew Davis, James Johndrow, Zonghao Li, Youssef Marzouk, Natesh Pillai, Pengfei Wang and Azeem Zaman.)
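To make the subsampling idea concrete, the following is a minimal illustrative sketch (not any speaker's actual algorithm): a random-walk Metropolis-Hastings chain for the mean of a Gaussian model, run once with the full-data log-posterior and once with a naive subsampled estimate that scores a random minibatch and rescales it by n/m. The model, the minibatch size, and all variable names are assumptions chosen for illustration; plugging the noisy estimate into the acceptance ratio is exactly the kind of shortcut whose limitations the talk addresses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n observations from N(mu_true, 1); flat prior on mu.
n = 10_000
mu_true = 2.0
data = rng.normal(mu_true, 1.0, size=n)

def log_post_full(mu):
    # Full-data log-posterior (up to an additive constant).
    return -0.5 * np.sum((data - mu) ** 2)

def log_post_subsample(mu, m=100):
    # Naive subsampling: score a random minibatch of size m and
    # rescale by n/m. The estimate is unbiased for the log-posterior,
    # but using it inside the Metropolis-Hastings ratio perturbs the
    # chain's stationary distribution.
    batch = rng.choice(data, size=m, replace=False)
    return -(n / m) * 0.5 * np.sum((batch - mu) ** 2)

def mh_chain(log_post, n_iter=2000, step=0.05, mu0=0.0):
    # Standard random-walk Metropolis-Hastings.
    mu, lp = mu0, log_post(mu0)
    samples = []
    for _ in range(n_iter):
        prop = mu + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        samples.append(mu)
    return np.array(samples)

full = mh_chain(log_post_full)
sub = mh_chain(log_post_subsample)
print("full-data chain mean: ", full[500:].mean())
print("subsampled chain mean:", sub[500:].mean())
```

Each subsampled step is roughly n/m = 100 times cheaper here, but the subsampled chain no longer targets the exact posterior; comparing the two chains' spread shows the trade-off that the no-free-lunch result quantifies in general.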

PLACE / LIEU

Hybrid - CRM, Salle / Room 5340, Pavillon André Aisenstadt
