We study the foundations of variational inference for probabilistic programming. Variational inference frames posterior inference as an optimisation problem, which is in practice addressed with gradient-based algorithms; in particular, the reparameterisation gradient estimator, which typically reduces variance, is applied very successfully. Unfortunately, non-differentiable models, which are readily expressible in programming languages, can compromise the correctness of this approach. We study the continuous but possibly non-differentiable setting: we provide categorical models, prove unbiasedness of the reparameterisation gradient estimator, and demonstrate how to establish continuity compositionally in a language with conditionals. Abstractly, this provides a foundation for fast yet correct inference for non-differentiable continuous models.
On the Reparameterisation Gradient for Non-Differentiable but Continuous Models
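To make the setting concrete, the following is a minimal sketch (not the paper's construction) of the reparameterisation gradient for a continuous but non-differentiable objective: estimating the gradient of E_{z ~ N(mu, 1)}[|z|] with respect to mu, by writing z = mu + eps with eps ~ N(0, 1) and differentiating through the sample path. The model |z| is continuous everywhere but not differentiable at 0, which is the regime the abstract describes.

```python
import math
import numpy as np

# Illustrative example only: reparameterise z ~ N(mu, 1) as z = mu + eps,
# eps ~ N(0, 1), and estimate d/dmu E[|z|] by the pathwise (reparameterisation)
# gradient sign(mu + eps). Here the estimator is unbiased despite |.| being
# non-differentiable at 0, since the kink is hit with probability zero.

rng = np.random.default_rng(0)
mu = 1.5
eps = rng.standard_normal(200_000)
grad_est = np.sign(mu + eps).mean()  # Monte Carlo pathwise gradient estimate

# Closed form for comparison: d/dmu E[|z|] = 2 * Phi(mu) - 1,
# where Phi is the standard normal CDF.
grad_true = 2 * 0.5 * (1 + math.erf(mu / math.sqrt(2))) - 1
```

With enough samples, `grad_est` agrees closely with `grad_true`, illustrating the unbiasedness result the abstract states for the continuous, possibly non-differentiable case.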