Differentiable Particle Filtering without Modifying the Forward Pass

06/18/2021
by Adam Ścibior, et al.

In recent years, particle filters have been used as components in systems optimized end-to-end with gradient descent. However, the resampling step in a particle filter is not differentiable, which biases gradients and interferes with optimization. To remedy this problem, several differentiable variants of resampling have been proposed, all of which modify the behavior of the particle filter in significant and potentially undesirable ways. In this paper, we show how to obtain unbiased estimators of the gradient of the marginal likelihood by modifying only the messages used in backpropagation, leaving the standard forward pass of a particle filter unchanged. Our method is simple to implement, has low computational overhead, does not introduce additional hyperparameters, and extends to derivatives of higher orders. We call it stop-gradient resampling, since it can easily be implemented with automatic differentiation libraries by using the stop-gradient operator instead of explicitly modifying the backward messages.
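To make the mechanics concrete, below is a minimal sketch (not the authors' reference implementation) of a single stop-gradient resampling step in PyTorch; the names `stop_gradient_resample`, `particles`, and `log_weights` are illustrative assumptions. The key factor w / stop_gradient(w) evaluates to exactly 1 in the forward pass, so the filter behaves like a standard particle filter, while backpropagation picks up the otherwise-lost score term for the resampling distribution.

```python
import math
import torch

def stop_gradient_resample(particles, log_weights):
    """One multinomial resampling step whose forward pass is standard,
    with a w / stop_gradient(w) factor folded into the new weights so
    that backpropagation receives the resampling score term.

    Sketch under stated assumptions; not the paper's reference code.
    """
    n = particles.shape[0]
    norm_w = torch.softmax(log_weights, dim=0)            # normalized weights
    idx = torch.multinomial(norm_w, n, replacement=True)  # non-differentiable draw
    resampled = particles[idx]                            # gradients still flow through particle values
    # Forward value is exactly 1 for every particle; backward contributes
    # the gradient of log(norm_w[idx]), i.e. the score-function correction.
    grad_carrier = norm_w[idx] / norm_w[idx].detach()
    # Uniform weights 1/n, up to the gradient-carrying factor above.
    new_log_weights = torch.log(grad_carrier) - math.log(n)
    return resampled, new_log_weights
```

In a typical use of such a routine, the log marginal likelihood estimate would be accumulated as `logsumexp(log_weights) - log(n)` before each resampling call; the correction factor then routes the resampling distribution's score into the gradient of that estimate without changing any forward value.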


Related research

04/24/2020 · Towards Differentiable Resampling
Resampling is a key component of sample-based recursive state estimation...

02/19/2023 · An overview of differentiable particle filters for data-adaptive sequential Bayesian inference
By approximating posterior distributions with weighted samples, particle...

10/24/2022 · Scaling up and Stabilizing Differentiable Planning with Implicit Differentiation
Differentiable planning promises end-to-end differentiability and adapti...

02/15/2021 · Differentiable Particle Filtering via Entropy-Regularized Optimal Transport
Particle Filtering (PF) methods are an established class of procedures f...

11/11/2020 · End-To-End Semi-supervised Learning for Differentiable Particle Filters
Recent advances in incorporating neural networks into particle filters p...

06/26/2023 · PMaF: Deep Declarative Layers for Principal Matrix Features
We explore two differentiable deep declarative layers, namely least squa...

11/24/2021 · Softmax Gradient Tampering: Decoupling the Backward Pass for Improved Fitting
We introduce Softmax Gradient Tampering, a technique for modifying the g...
