Efficient and Modular Implicit Differentiation

by Mathieu Blondel, et al.

Automatic differentiation (autodiff) has revolutionized machine learning. It allows expressing complex computations by composing elementary ones in creative ways, and removes the burden of computing their derivatives by hand. More recently, differentiation of optimization problem solutions has attracted widespread attention, with applications such as optimization as a layer, and in bi-level problems such as hyper-parameter optimization and meta-learning. However, the formulas for these derivatives often require tedious case-by-case mathematical derivations. In this paper, we propose a unified, efficient and modular approach for implicit differentiation of optimization problems. In our approach, the user defines (in Python in the case of our implementation) a function F capturing the optimality conditions of the problem to be differentiated. Once this is done, we leverage autodiff of F and implicit differentiation to automatically differentiate the optimization problem. Our approach thus combines the benefits of implicit differentiation and autodiff. It is efficient, as it can be added on top of any state-of-the-art solver, and modular, as the optimality condition specification is decoupled from the implicit differentiation mechanism. We show that seemingly simple principles allow us to recover many recently proposed implicit differentiation methods and to create new ones easily. We demonstrate the ease of formulating and solving bi-level optimization problems using our framework. We also showcase an application to the sensitivity analysis of molecular dynamics.
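The core idea can be sketched on a toy problem. Below is a minimal JAX illustration (not the paper's library): the user writes only an optimality condition F(x, θ) = 0 for the inner problem, and the implicit function theorem gives the derivative of the solution as -(∂F/∂x)⁻¹ ∂F/∂θ, with both Jacobians obtained by autodiff of F. The ridge-regression example and all names here are assumptions for illustration.

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)

# Toy inner problem (illustrative, not from the paper):
# x*(theta) = argmin_x 0.5*||A x - b||^2 + 0.5*theta*||x||^2
A = jnp.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
b = jnp.array([1.0, 0.0, -1.0])

def solver(theta):
    # Closed-form inner solve; stands in for any black-box solver.
    n = A.shape[1]
    return jnp.linalg.solve(A.T @ A + theta * jnp.eye(n), A.T @ b)

def F(x, theta):
    # Optimality condition: the gradient of the inner objective
    # vanishes at the solution x*(theta).
    return A.T @ (A @ x - b) + theta * x

def implicit_jacobian(theta):
    x_star = solver(theta)
    # Implicit function theorem: dx*/dtheta = -(dF/dx)^{-1} dF/dtheta,
    # where both partial Jacobians come from autodiff of F alone.
    dF_dx = jax.jacobian(F, argnums=0)(x_star, theta)
    dF_dtheta = jax.jacobian(F, argnums=1)(x_star, theta)
    return -jnp.linalg.solve(dF_dx, dF_dtheta)

theta = 0.5
direct = jax.jacobian(solver)(theta)   # differentiate through the solve
implicit = implicit_jacobian(theta)    # differentiate via F only
print(jnp.allclose(direct, implicit))  # True
```

Note the decoupling: `implicit_jacobian` never looks inside `solver`, so the same mechanism applies unchanged if the closed-form solve is swapped for an iterative state-of-the-art solver.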


Nonsmooth Implicit Differentiation for Machine Learning and Optimization

In view of training increasingly complex learning architectures, we esta...

A Unified Framework for Implicit Sinkhorn Differentiation

The Sinkhorn operator has recently experienced a surge of popularity in ...

Efficient Automatic Differentiation of Implicit Functions

Derivative-based algorithms are ubiquitous in statistics, machine learni...

Amortized Implicit Differentiation for Stochastic Bilevel Optimization

We study a class of algorithms for solving bilevel optimization problems...

Improved Marginal Unbiased Score Expansion (MUSE) via Implicit Differentiation

We apply the technique of implicit differentiation to boost performance,...

Analyzing Inexact Hypergradients for Bilevel Learning

Estimating hyperparameters has been a long-standing problem in machine l...

Automatically Bounding the Taylor Remainder Series: Tighter Bounds and New Applications

We present a new algorithm for automatically bounding the Taylor remaind...