Nov 11, 2018 · Empirical evaluations on standard benchmarks confirm that SLANG enables faster and more accurate estimation of uncertainty than mean-field ...
Our method also enables fast estimation by using an approximate natural-gradient algorithm that builds the covariance estimate solely based on the back-propagated gradients.
To make the inference faster, the authors also built a natural gradient update algorithm based on the method proposed in [1]. Strengths: In my opinion, this ...
This work proposes a new stochastic, low-rank, approximate natural-gradient (SLANG) method for variational inference in large deep models that enables ...
Dec 3, 2018 · To address this issue, we propose a new stochastic, low-rank, approximate natural-gradient (SLANG) method for variational inference in large, ...
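The snippets above describe SLANG's structural idea: approximate the covariance (precision) of the variational posterior with a "low-rank plus diagonal" matrix built from back-propagated gradients, avoiding both the full covariance and the crude mean-field diagonal. A minimal NumPy sketch of that structure, not the authors' algorithm: the function name, the eigendecomposition-via-SVD route, and the per-example gradient matrix `G` are illustrative assumptions.

```python
import numpy as np

def low_rank_plus_diag(G, k):
    """Approximate the empirical Fisher F = G.T @ G / m by U @ U.T + diag(d):
    a rank-k factor U from the top eigenpairs, plus a diagonal correction d
    that makes the approximation match F exactly on the diagonal.
    G is a hypothetical (m, p) matrix of per-example back-propagated gradients.
    """
    m, p = G.shape
    # Top-k eigenpairs of F via the SVD of G (never forms the p x p matrix).
    _, s, Vt = np.linalg.svd(G / np.sqrt(m), full_matrices=False)
    U = Vt[:k].T * s[:k]                    # column j is s_j * v_j, so U @ U.T = sum s_j^2 v_j v_j^T
    # Diagonal correction; clipping at 0 is only numerical safety, since the
    # truncated eigen-expansion never exceeds F on the diagonal.
    diag_F = np.sum(G**2, axis=0) / m
    d = np.maximum(diag_F - np.sum(U**2, axis=1), 0.0)
    return U, d

# Usage on random data: diagonals of F and the approximation agree by construction.
rng = np.random.default_rng(0)
G = rng.standard_normal((32, 10))
U, d = low_rank_plus_diag(G, k=3)
F = G.T @ G / 32
approx = U @ U.T + np.diag(d)
```

The low-rank part captures the dominant correlations between parameters, while the diagonal keeps per-parameter variances exact; storage drops from O(p^2) to O(pk + p).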
Code for reproducing the experimental results in the paper SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient
2. SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient (NeurIPS 2018), Mishkin, Kunstner, Nielsen, Schmidt, Khan ...
Nov 23, 2018 · A three minute summary of our NIPS 2018 paper, "SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural ...
SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient. A. Mishkin, F. Kunstner, D. Nielsen, M. Schmidt, M. E. Khan.