Type-Preserving, Dependence-Aware Guide Generation for Sound, Effective Amortized Probabilistic Inference (Virtual)
In probabilistic programming languages (PPLs), a critical step in optimization-based inference methods is constructing, for a given model program, a trainable guide program.
Soundness and effectiveness of inference rely on constructing good guides, but the expressive power of a universal PPL poses challenges.
This paper introduces an approach to automatically generating guides for deep amortized inference in a universal PPL.
Guides are generated by a type-directed translation based on a novel behavioral type system.
Guide generation extracts and exploits independence structures using a syntactic approach to conditional independence, with a semantic account left to future work.
Despite the control-flow expressiveness allowed by the universal PPL, the generated guides are guaranteed to satisfy a critical soundness condition and, moreover, consistently improve training and inference over state-of-the-art baselines on a suite of benchmarks.
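To make the terminology concrete, below is a minimal, hand-written model/guide pair in Pyro, a universal PPL, sketching what a guide program and amortized inference look like. This is an illustrative assumption for context only, not the paper's generated guide or type system; the encoder architecture, variable names, and hyperparameters are hypothetical.

```python
# Minimal sketch (assumptions, not the paper's output): a "guide" is a program
# with the same latent sample sites as the model; amortized inference trains a
# neural network that maps observed data to the guide's variational parameters.
import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

def model(x_obs):
    # Latent location with a standard-normal prior.
    z = pyro.sample("z", dist.Normal(0.0, 1.0))
    # Observations are conditionally independent given z (a plate).
    with pyro.plate("data", x_obs.shape[0]):
        pyro.sample("x", dist.Normal(z, 1.0), obs=x_obs)

# Hypothetical encoder: amortizes inference by mapping data to guide parameters.
encoder = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 2))

def guide(x_obs):
    pyro.module("encoder", encoder)
    # Summarize the data and predict the variational mean and log-scale of z.
    stats = x_obs.mean(dim=0, keepdim=True).unsqueeze(-1)   # shape (1, 1)
    loc, log_scale = encoder(stats).squeeze(0)
    pyro.sample("z", dist.Normal(loc, log_scale.exp()))

# Train the guide's parameters by stochastic variational inference (ELBO).
svi = SVI(model, guide, Adam({"lr": 1e-2}), loss=Trace_ELBO())
x_obs = torch.randn(20) + 1.5   # synthetic observations
for _ in range(200):
    svi.step(x_obs)
```

In this hand-written setting the programmer must ensure the guide samples exactly the latent sites of the model and respects its dependence structure; the paper's contribution is to derive such guides automatically from the model program while guaranteeing the soundness condition.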
Wed 18 Jan (displayed time zone: Eastern Time, US & Canada)
Session: 16:45 - 18:00

16:45 (25m, Talk) | Affine Monads and Lazy Structures for Bayesian Programming | POPL | Swaraj Dash, Younesse Kaddar, Hugo Paquet, Sam Staton (University of Oxford) | DOI
17:10 (25m, Talk, Virtual) | Type-Preserving, Dependence-Aware Guide Generation for Sound, Effective Amortized Probabilistic Inference | POPL | Jianlin Li, Leni Ven, Pengyuan Shi, Yizhou Zhang (University of Waterloo) | DOI
17:35 (25m, Talk) | Smoothness Analysis for Probabilistic Programs with Application to Optimised Variational Inference | POPL | Wonyeol Lee (Stanford University), Xavier Rival (Inria; ENS; CNRS; PSL University), Hongseok Yang (KAIST; IBS) | DOI