Tomorrow @latentjasper, @balajiln, and I present a #NeurIPS2020 tutorial on "Practical Uncertainty Estimation and Out-of-Distribution Robustness in Deep Learning". Whether you're new to the area or an expert, there's critically useful info! 8-10:30a PT nips.cc/virtual/2020/t…
The talk is split into three sections: 1. Why Uncertainty & Robustness; 2. Foundations; and 3. Recent Advances.
Tutorials do _not_ need registration to attend!
See everyone at the conference!
Check out BatchEnsemble: Efficient Ensembling with Rank-1 Perturbations at the #NeurIPS2019 Bayesian DL workshop. Better accuracy and uncertainty than dropout, and competitive with ensembles across a wide range of tasks. 1/-
It’s a drop-in replacement for individual layers (like dropout, batchnorm, and variational layers) and is available, along with baselines and Bayesian layers, at github.com/google/edward2. 2/-
Unlike deep ensembles, it’s trained end-to-end under a single loss function (NLL), and computation can be parallelized across ensemble members on GPUs/TPUs. BatchEnsemble is essentially a new parameterization for neural nets. 3/-
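The rank-1 idea above can be sketched in a few lines. This is a hypothetical NumPy illustration (not the edward2 API): each ensemble member i reuses one shared weight matrix W, modulated by a rank-1 factor outer(r_i, s_i), so the per-member weights W * outer(r_i, s_i) are never materialized.

```python
import numpy as np


class BatchEnsembleDense:
    """Minimal sketch of a BatchEnsemble dense layer.

    Member i's effective weights are W * outer(r_i, s_i), but the
    forward pass only stores two vectors (r_i, s_i) per member on
    top of the shared "slow" weights W.
    """

    def __init__(self, in_dim, out_dim, ensemble_size, seed=0):
        rng = np.random.default_rng(seed)
        # Shared slow weights, one copy for the whole ensemble.
        self.W = rng.normal(scale=1.0 / np.sqrt(in_dim), size=(in_dim, out_dim))
        # Per-member rank-1 "fast" weights, initialized near 1.
        self.r = rng.normal(loc=1.0, scale=0.1, size=(ensemble_size, in_dim))
        self.s = rng.normal(loc=1.0, scale=0.1, size=(ensemble_size, out_dim))
        self.b = np.zeros((ensemble_size, out_dim))

    def forward(self, x, member):
        # Algebraically equal to x @ (W * outer(r, s)) + b, computed
        # with two cheap elementwise scalings instead of a new matrix.
        return ((x * self.r[member]) @ self.W) * self.s[member] + self.b[member]
```

Because the member-specific part is just elementwise scaling, all members can be evaluated in one batched matmul by tiling the inputs, which is what makes the ensemble parallelizable on GPUs/TPUs.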