Wed 18 Jan 2023 17:10 - 17:35 at Grand Ballroom A - Probabilistic Inference Chair(s): Steven Holtzen

In probabilistic programming languages (PPLs), a critical step in optimization-based inference methods is constructing, for a given model program, a trainable guide program.
Soundness and effectiveness of inference rely on constructing good guides, but the expressive power of a universal PPL poses challenges.
This paper introduces an approach to automatically generating guides for deep amortized inference in a universal PPL.
Guides are generated by a type-directed translation, guided by a novel behavioral type system.
Guide generation extracts and exploits independence structures using a syntactic approach to conditional independence, with a semantic account left to future work.
Despite the control-flow expressiveness allowed by the universal PPL, generated guides are guaranteed to satisfy a critical soundness condition; moreover, they consistently improve training and inference over state-of-the-art baselines on a suite of benchmarks.
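To make the model/guide distinction concrete: a model program defines a joint distribution over latent and observed variables, and an amortized guide is a function mapping each observation to a proposal distribution over the latents. The sketch below is not the paper's type-directed generation; it is a minimal hand-written guide for a toy conjugate Gaussian model (z ~ Normal(0, 1); x ~ Normal(z, 1)), whose exact posterior z | x ~ Normal(x/2, 1/2) makes the ideal guide easy to write down, used here as an importance-sampling proposal:

```python
import math
import random

def guide(x):
    """Amortized guide: maps observation x to the parameters
    (mean, variance) of the proposal q(z | x). For this conjugate
    Gaussian model, the exact posterior is Normal(x/2, 1/2)."""
    return x / 2.0, 0.5

def normal_logpdf(v, mean, var):
    """Log density of Normal(mean, var) at v."""
    return -0.5 * math.log(2 * math.pi * var) - (v - mean) ** 2 / (2 * var)

def posterior_mean_estimate(x, n=10_000, seed=0):
    """Self-normalized importance sampling estimate of E[z | x],
    drawing proposals from the guide and weighting by
    prior * likelihood / proposal."""
    rng = random.Random(seed)
    mean, var = guide(x)
    total_w, total_wz = 0.0, 0.0
    for _ in range(n):
        z = rng.gauss(mean, math.sqrt(var))
        log_w = (normal_logpdf(z, 0.0, 1.0)       # prior p(z)
                 + normal_logpdf(x, z, 1.0)       # likelihood p(x | z)
                 - normal_logpdf(z, mean, var))   # proposal q(z | x)
        w = math.exp(log_w)
        total_w += w
        total_wz += w * z
    return total_wz / total_w
```

In a universal PPL, where the model may branch and loop on sampled values, a hand-written guide like this is no longer straightforward to get right, which is the gap the automated, type-directed generation addresses.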

Wed 18 Jan

Displayed time zone: Eastern Time (US & Canada)

16:45 - 18:00
Probabilistic Inference (POPL) at Grand Ballroom A
Chair(s): Steven Holtzen Northeastern University
16:45
25m
Talk
Affine Monads and Lazy Structures for Bayesian Programming
POPL
Swaraj Dash University of Oxford, Younesse Kaddar University of Oxford, Hugo Paquet University of Oxford, Sam Staton University of Oxford
17:10
25m
Talk
Type-Preserving, Dependence-Aware Guide Generation for Sound, Effective Amortized Probabilistic Inference (Virtual)
POPL
Jianlin Li University of Waterloo, Leni Ven University of Waterloo, Pengyuan Shi University of Waterloo, Yizhou Zhang University of Waterloo
17:35
25m
Talk
Smoothness Analysis for Probabilistic Programs with Application to Optimised Variational Inference
POPL
Wonyeol Lee Stanford University, Xavier Rival Inria; ENS; CNRS; PSL University, Hongseok Yang KAIST; IBS