Robustness to Out-of-Distribution Inputs via Task-Aware Generative Uncertainty

12/27/2018
by Rowan McAllister, et al.

Deep learning provides a powerful tool for machine perception when the observations resemble the training data. However, real-world robotic systems must react intelligently to their observations even in unexpected circumstances. This requires a system to reason about its own uncertainty given unfamiliar, out-of-distribution observations. Approximate Bayesian approaches are commonly used to estimate uncertainty for neural network predictions, but can struggle with out-of-distribution observations. Generative models can in principle detect out-of-distribution observations as those with a low estimated density. However, the mere presence of an out-of-distribution input does not by itself indicate an unsafe situation. In this paper, we present a method for uncertainty-aware robotic perception that combines generative modeling and model uncertainty to cope with uncertainty stemming from out-of-distribution states. Our method estimates an uncertainty measure for the model's prediction, taking into account an explicit (generative) model of the observation distribution to handle out-of-distribution inputs. This is accomplished by probabilistically projecting observations onto the training distribution, such that out-of-distribution inputs map to uncertain in-distribution observations, which in turn produce uncertain task-related predictions, but only if task-relevant parts of the image change. We evaluate our method on an action-conditioned collision prediction task with both simulated and real data, and demonstrate that projecting out-of-distribution observations in this way improves the performance of four standard Bayesian and non-Bayesian neural network approaches, offering more favorable trade-offs between the proportion of time a robot can remain autonomous and the proportion of impending crashes successfully avoided.
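The projection step can be made concrete with a short sketch. The following is a minimal, illustrative implementation, assuming the generative model is a variational autoencoder whose encode/decode methods expose the approximate posterior and a decoder, and that predictor is a trained task model such as an action-conditioned collision predictor. These names and interfaces are hypothetical stand-ins, not the paper's released code.

import torch

def task_aware_prediction(x, vae, predictor, n_samples=50):
    # Hypothetical interfaces (illustrative, not from the paper's code):
    #   vae.encode(x)    -> (mu, log_var) of the approximate posterior q(z | x)
    #   vae.decode(z)    -> reconstruction lying on the training manifold
    #   predictor(x_hat) -> task output, e.g. a collision probability
    mu, log_var = vae.encode(x)
    std = torch.exp(0.5 * log_var)
    preds = []
    for _ in range(n_samples):
        z = mu + std * torch.randn_like(std)  # sample z ~ q(z | x)
        x_hat = vae.decode(z)                 # project onto the training distribution
        preds.append(predictor(x_hat))        # predict only from in-distribution inputs
    preds = torch.stack(preds)
    # The mean is the task prediction; the variance is the task-aware
    # uncertainty, which grows only when the parts of the input the
    # predictor relies on are reconstructed inconsistently.
    return preds.mean(dim=0), preds.var(dim=0)

Because the predictor only ever sees reconstructions drawn from the training distribution, an out-of-distribution input shows up as disagreement among the sampled predictions rather than as a single confidently wrong output. A robot can then threshold this variance to decide when to hand control to a safe fallback policy instead of remaining autonomous, which is the trade-off evaluated in the paper's collision prediction experiments.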


