TY - CHAP
U1 - Conference publication
A1 - Brach, Kai
A1 - Sick, Beate
A1 - Dürr, Oliver
T1 - Single Shot MC Dropout Approximation
T2 - ICML 2020 Workshop on Uncertainty and Robustness in Deep Learning, July 17, 2020, virtual
N2 - Deep neural networks (DNNs) are known for their high prediction performance, especially in perceptual tasks such as object recognition or autonomous driving. Still, DNNs are prone to yield unreliable predictions when encountering completely new situations, without indicating their uncertainty. Bayesian variants of DNNs (BDNNs), such as MC dropout BDNNs, do provide uncertainty measures. However, BDNNs are slow during test time because they rely on a sampling approach. Here we present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN. Our approach is to analytically approximate, for each layer in a fully connected network, the expected value and the variance of the MC dropout signal. We evaluate our approach on different benchmark datasets and a simulated toy example. We demonstrate that our single shot MC dropout approximation resembles the point estimate and the uncertainty estimate of the predictive distribution that is achieved with an MC approach, while being fast enough for real-time deployments of BDNNs.
Y1 - 2020
UR - https://arxiv.org/abs/2007.03293
UR - http://www.gatsby.ucl.ac.uk/~balaji/udl2020/accepted-papers/UDL2020-paper-106.pdf
SP - 6
S1 - 6
ER -