Defensive Perception: Estimation and Monitoring of Neural Network Performance under Deployment

by Hendrik Vogt, et al.

In this paper, we propose a method for addressing the issue of unnoticed catastrophic deployment and domain shift in neural networks for semantic segmentation in autonomous driving. Our approach is based on the idea that deep learning-based perception for autonomous driving is uncertain and best represented as a probability distribution. As the safety of autonomous vehicles is paramount, it is crucial for perception systems to recognize when the vehicle is leaving its operational design domain, anticipate hazardous uncertainty, and respond to degraded perception performance. To this end, we propose to encapsulate the deployed neural network within an uncertainty estimation envelope based on epistemic uncertainty estimation via Monte Carlo Dropout. This approach requires no modification of the deployed neural network and leaves its expected performance unaffected. Our defensive perception envelope estimates the neural network's performance, enabling monitoring and notification when the system enters domains of reduced performance under deployment. Furthermore, we extend the envelope with novel methods that improve its applicability in deployment settings, including reducing computational cost and suppressing estimation noise. Finally, we demonstrate the applicability of our method for multiple deployment shifts relevant to autonomous driving, such as transitions into night, rain, or snow domains. Overall, our approach shows strong potential for deployment settings and enables operational design domain recognition via uncertainty, which allows for defensive perception, safe-state triggers, warning notifications, and feedback for testing, development, and adaptation of the perception stack.
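To make the core mechanism concrete, the following is a minimal sketch of Monte Carlo Dropout uncertainty estimation as the abstract describes it: dropout is kept active at inference time, the same input is passed through the network several times, and the disagreement between the stochastic passes serves as an epistemic-uncertainty signal. All names here (`mc_dropout_uncertainty`, `toy_forward`, the pass count) are illustrative assumptions, not the authors' implementation; the "network" is replaced by a toy stochastic function so the sketch stays self-contained.

```python
import random
import statistics


def mc_dropout_uncertainty(stochastic_forward, num_passes=30):
    """Summarize several stochastic forward passes (dropout active at
    inference) as a mean prediction plus a variance-based epistemic
    uncertainty estimate. Illustrative sketch, not the paper's code."""
    samples = [stochastic_forward() for _ in range(num_passes)]
    mean_pred = statistics.fmean(samples)
    epistemic = statistics.pvariance(samples)  # spread across passes
    return mean_pred, epistemic


def toy_forward(drop_p=0.5, base=0.9):
    """Toy stand-in for one per-pixel confidence of a segmentation net.
    Dropout randomly zeroes the activation, so repeated passes disagree;
    the surviving activation is rescaled as in inverted dropout."""
    keep = 0.0 if random.random() < drop_p else 1.0
    return base * keep / (1.0 - drop_p)


random.seed(0)  # fixed seed so the sketch is reproducible
mean_pred, epistemic = mc_dropout_uncertainty(toy_forward)
```

In a deployment envelope of this kind, `epistemic` would be aggregated over the image and tracked over time; a rising value indicates the vehicle may be leaving its operational design domain, e.g. driving into night or heavy snow.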


