π VAE: a stochastic process prior for Bayesian deep learning with MCMC

Research output: Contribution to journal › Journal article › Research › peer-review

Documents

  • Fulltext: Final published version, 4.36 MB, PDF document

  • Swapnil Mishra
  • Seth Flaxman
  • Tresnia Berah
  • Harrison Zhu
  • Mikko Pakkanen
  • Samir Bhatt

Stochastic processes provide a mathematically elegant way to model complex data. In theory, they provide flexible priors over function classes that can encode a wide range of interesting assumptions. In practice, however, efficient inference by optimisation or marginalisation is difficult, a problem further exacerbated by big data and high-dimensional input spaces. We propose a novel variational autoencoder (VAE) called the prior encoding variational autoencoder (πVAE). πVAE is a new continuous stochastic process. We use πVAE to learn low-dimensional embeddings of function classes by combining a trainable feature mapping with a generative model, using a VAE. We show that our framework can accurately learn expressive function classes such as Gaussian processes, as well as properties of functions such as their integrals. For popular tasks, such as spatial interpolation, πVAE achieves state-of-the-art performance in terms of both accuracy and computational efficiency. Perhaps most usefully, we demonstrate an elegant and scalable means of performing fully Bayesian inference for stochastic processes within probabilistic programming languages such as Stan.
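The key inferential idea described in the abstract is that, once the VAE is trained, a function is represented by a low-dimensional latent code, so fully Bayesian inference reduces to MCMC over that latent space. The sketch below illustrates this with a toy stand-in for a trained decoder: the feature mapping `phi` and the matrix `W` are random placeholders (in the paper they are learned jointly by the VAE), and random-walk Metropolis substitutes for Stan's HMC. It is a minimal illustration of the inference step only, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

D_LATENT, D_FEAT = 2, 10

def phi(x):
    # Hypothetical fixed feature mapping of input locations; in piVAE this
    # mapping is trainable and learned alongside the VAE.
    freqs = np.linspace(0.5, 5.0, D_FEAT)
    return np.cos(np.outer(x, freqs))            # shape (n, D_FEAT)

# Placeholder "decoder" weights; a trained piVAE would supply these.
W = rng.normal(size=(D_LATENT, D_FEAT))

def decode(z, x):
    # Function values at locations x implied by the low-dimensional code z.
    return phi(x) @ (W.T @ z)                    # shape (n,)

# Toy observed data: noisy evaluations of an unknown function.
x_obs = np.linspace(0.0, 1.0, 20)
y_obs = np.sin(4.0 * x_obs) + 0.1 * rng.normal(size=20)

def log_post(z, sigma=0.1):
    # Standard-normal prior on z plus a Gaussian likelihood of the data.
    resid = y_obs - decode(z, x_obs)
    return -0.5 * z @ z - 0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis over the latent space (HMC in Stan plays this
# role in the paper; MH keeps the sketch dependency-free).
z = np.zeros(D_LATENT)
samples = []
for _ in range(2000):
    z_prop = z + 0.2 * rng.normal(size=D_LATENT)
    if np.log(rng.uniform()) < log_post(z_prop) - log_post(z):
        z = z_prop
    samples.append(z.copy())

samples = np.array(samples)
print(samples.shape)  # (2000, 2)
```

Because the posterior lives in a 2-dimensional latent space rather than over the n function values directly, each MCMC step is cheap regardless of how many locations the decoded function is later evaluated at, which is the source of the scalability the abstract claims.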

Original language: English
Article number: 96
Journal: Statistics and Computing
Volume: 32
Issue number: 6
Number of pages: 16
ISSN: 0960-3174
DOIs
Publication status: Published - 2022

    Research areas

  • Bayesian inference, MCMC, VAE, Spatio-temporal, Gaussian processes
