Blurs Make Results Clearer: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness

05/26/2021
by Namuk Park et al.

Bayesian neural networks (BNNs) have shown success in uncertainty estimation and robustness. However, a crucial challenge prohibits their use in practice: BNNs require a large number of predictions to produce reliable results, leading to a significant increase in computational cost. To alleviate this issue, we propose spatial smoothing, a method that ensembles neighboring feature map points of CNNs. By simply adding a few blur layers to a model, we empirically show that spatial smoothing improves the accuracy, uncertainty estimation, and robustness of BNNs across a whole range of ensemble sizes. In particular, BNNs incorporating spatial smoothing achieve high predictive performance with only a handful of ensembles. Moreover, the method can also be applied to canonical deterministic neural networks to improve their performance. Several lines of evidence suggest that the improvements can be attributed to the smoothing and flattening of the loss landscape. In addition, we provide a fundamental explanation for prior works, namely global average pooling, pre-activation, and ReLU6, by treating them as special cases of spatial smoothing. These methods not only enhance accuracy but also improve uncertainty estimation and robustness by making the loss landscape smoother, in the same manner as spatial smoothing. The code is available at https://github.com/xxxnell/spatial-smoothing.
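
As a rough sketch of the idea, the PyTorch snippet below implements a blur layer as a fixed depthwise box filter that averages neighboring feature map points. The class name BlurLayer, the box kernel, and the usage shown are illustrative assumptions rather than the authors' exact implementation, which is available in the repository linked above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BlurLayer(nn.Module):
    """Ensemble neighboring feature map points with a fixed box blur.

    Illustrative sketch only: a single uniform depthwise filter applied
    channel-wise, so each output point is the average of its neighbors.
    """

    def __init__(self, kernel_size=3):
        super().__init__()
        self.kernel_size = kernel_size
        # Uniform (box) kernel; registered as a buffer so it moves with .to(device).
        kernel = torch.full((1, 1, kernel_size, kernel_size),
                            1.0 / (kernel_size * kernel_size))
        self.register_buffer("kernel", kernel)

    def forward(self, x):
        channels = x.shape[1]
        # Depthwise convolution: smooth each channel independently while
        # keeping the spatial resolution unchanged (odd kernel, "same" padding).
        weight = self.kernel.repeat(channels, 1, 1, 1)
        return F.conv2d(x, weight, padding=self.kernel_size // 2, groups=channels)


# Hypothetical usage: insert a blur layer into a CNN stage and average a
# handful of stochastic predictions (e.g. MC dropout) on top of it.
smooth = BlurLayer(kernel_size=3)
features = torch.randn(8, 64, 32, 32)   # (batch, channels, height, width)
print(smooth(features).shape)           # torch.Size([8, 64, 32, 32])
```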

