Stability of Optimal Filter Higher-Order Derivatives

06/25/2018
by Vladislav Z. B. Tadic, et al.

In many scenarios, a state-space model depends on a parameter which needs to be inferred from data. Using stochastic gradient search and the optimal filter (first-order) derivative, the parameter can be estimated online. To analyze the asymptotic behavior of online methods for parameter estimation in non-linear state-space models, it is necessary to establish results on the existence and stability of the optimal filter higher-order derivatives. The existence and stability properties of these derivatives are studied here. We show that the optimal filter higher-order derivatives exist and forget their initial conditions exponentially fast. We also show that the optimal filter higher-order derivatives are geometrically ergodic. The obtained results hold under (relatively) mild conditions and apply to state-space models routinely encountered in practice.
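The online estimation scheme mentioned above (stochastic gradient search driven by the optimal filter first-order derivative) can be illustrated in the simplest tractable setting: a scalar linear-Gaussian state-space model, where the optimal filter is the Kalman filter and its parameter derivative is available in closed form via the tangent filter recursion. The sketch below is illustrative only and is not taken from the paper; the model, the function names, and all numerical values (noise variances, step size, projection interval) are assumptions made for the example.

```python
import numpy as np


def simulate(a, q, r, n, rng):
    """Simulate the (assumed) linear-Gaussian state-space model
    x_{t+1} = a*x_t + w_t,  y_t = x_t + v_t,
    with w_t ~ N(0, q) and v_t ~ N(0, r)."""
    x = 0.0
    ys = np.empty(n)
    for t in range(n):
        x = a * x + rng.normal(scale=np.sqrt(q))
        ys[t] = x + rng.normal(scale=np.sqrt(r))
    return ys


def online_gradient_estimate(ys, q, r, a0=0.3, step=0.005):
    """Recursive maximum likelihood for the parameter a: run the Kalman
    filter together with its first-order derivative with respect to a
    (the tangent filter), and ascend the per-observation log-likelihood
    by stochastic gradient search."""
    a = a0
    m, P = 0.0, 1.0    # filter mean and variance
    dm, dP = 0.0, 0.0  # their derivatives with respect to a
    for y in ys:
        # Prediction step and its derivative.
        m_p = a * m
        P_p = a * a * P + q
        dm_p = m + a * dm
        dP_p = 2.0 * a * P + a * a * dP
        # Innovation, its variance, and their derivatives.
        e = y - m_p
        S = P_p + r
        de = -dm_p
        dS = dP_p
        # Gradient of the log-likelihood increment
        # -0.5*(log(2*pi*S) + e^2/S) with respect to a.
        dll = -0.5 * dS / S - e * de / S + 0.5 * e * e * dS / (S * S)
        # Filter update step and its derivative.
        K = P_p / S
        dK = (dP_p * S - P_p * dS) / (S * S)
        m = m_p + K * e
        dm = dm_p + dK * e + K * de
        P = (1.0 - K) * P_p
        dP = -dK * P_p + (1.0 - K) * dP_p
        # Stochastic gradient step, projected onto the stable region.
        a = min(max(a + step * dll, -0.99), 0.99)
    return a
```

As a usage example (again with assumed values), simulating with a true parameter of 0.8 and running the recursion from an initial guess of 0.3 typically drives the estimate toward the true value:

```python
rng = np.random.default_rng(0)
ys = simulate(0.8, q=0.5, r=0.5, n=50000, rng=rng)
a_hat = online_gradient_estimate(ys, q=0.5, r=0.5)
```

In non-linear models the exact filter and its derivative are intractable, which is where the stability and ergodicity results of the paper become relevant: they control the error propagation of approximate (e.g. particle) versions of the same recursion.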
