
Transformation Models for Flexible Posteriors in Variational Bayes

  • The main challenge in Bayesian models is to determine the posterior of the model parameters. Even in models with only one or a few parameters, the analytical posterior can be determined only in special settings. In Bayesian neural networks, variational inference is widely used to approximate difficult-to-compute posteriors by variational distributions. Usually, Gaussians are used as variational distributions (Gaussian-VI), which limits the quality of the approximation due to their restricted flexibility. Transformation models, on the other hand, are flexible enough to fit any distribution. Here we present transformation model-based variational inference (TM-VI) and demonstrate that it accurately approximates complex posteriors in models with one parameter, and that it also works in a mean-field fashion for multi-parameter models such as neural networks.
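The core idea described above is the change-of-variables trick behind transformation models: a simple base variable z ~ N(0, 1) is pushed through a strictly monotone transform g to yield a flexible variational density q(w). The sketch below illustrates this with a hypothetical affine-plus-sinh warp standing in for the Bernstein-polynomial transforms typically used in transformation models; it is an illustration of the principle, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monotone transform g (an affine-plus-sinh warp used here as a
# stand-in for the flexible monotone transforms of transformation models).
def g(z, a=1.0, b=0.5, c=0.3):
    # strictly increasing in z for a > 0 and c >= 0
    return a * z + b + c * np.sinh(z)

def g_prime(z, a=1.0, c=0.3):
    # derivative of g with respect to z; positive everywhere
    return a + c * np.cosh(z)

# Reparameterized sampling: draw z from the standard-normal base and push
# it through g to obtain samples w from the flexible variational density q.
z = rng.standard_normal(10_000)
w = g(z)

# Change of variables: log q(w) = log N(z; 0, 1) - log g'(z).
# These log-densities are what an ELBO objective would use during training.
log_q = -0.5 * (z**2 + np.log(2.0 * np.pi)) - np.log(g_prime(z))
```

Because w is a deterministic, differentiable function of z, gradients of an ELBO with respect to the transform's parameters (here a, b, c) could flow through the samples, which is what makes such transforms usable inside variational inference.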

Metadata
Author:Stefan Hörtling, Daniel Dold, Oliver Dürr, Beate Sick
URL:https://arxiv.org/abs/2106.00528v1
Document Type:Preprint
Language:English
Year of Publication:2021
Release Date:2022/01/08
Tag:Machine learning
Page Number:5
Institutes:Institut für Optische Systeme - IOS
DDC functional group:500 Natural Sciences and Mathematics
Open Access?:Yes
Relevance:Other publication