Transferring Knowledge across Learning Processes

12/03/2018
by Sebastian Flennerhag, et al.

In complex transfer learning scenarios, new tasks might not be tightly linked to previous tasks. Approaches that transfer only the information contained in the final parameters of a source model will therefore struggle. Instead, transfer learning at a higher level of abstraction is needed. We propose Leap, a framework that achieves this by transferring knowledge across learning processes. We associate each task with a manifold on which the training process travels from initialization to final parameters, and construct a meta-learning objective that minimizes the expected length of this path. Our framework leverages only information obtained during training and can be computed on the fly at negligible cost. We demonstrate that our framework outperforms competing methods, both in meta-learning and transfer learning, on a set of computer vision tasks. Finally, we demonstrate that Leap can transfer knowledge across learning processes in demanding reinforcement learning environments (Atari) that involve millions of gradient steps.
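To make the idea concrete, here is a toy sketch of the path-length view the abstract describes: each task's gradient-descent trajectory accumulates a length on the loss manifold (parameter increments plus loss increments), and a shared initialization is meta-trained so that expected path length shrinks. This is not the authors' implementation; the quadratic tasks, step sizes, and the crude first-order meta-update (a Reptile-like pull of the init toward each task's endpoint, rather than Leap's increment-weighted pull-forward gradient) are simplifications for illustration.

```python
import numpy as np

def train_path_length(theta0, grad_fn, loss_fn, lr=0.1, steps=20):
    """Gradient descent from theta0 on one task; returns the final
    parameters and the accumulated (squared) path length on the loss
    manifold: parameter increments plus loss increments."""
    theta, length = theta0.copy(), 0.0
    for _ in range(steps):
        new_theta = theta - lr * grad_fn(theta)
        length += np.sum((new_theta - theta) ** 2) \
                  + (loss_fn(new_theta) - loss_fn(theta)) ** 2
        theta = new_theta
    return theta, length

# Two toy quadratic tasks whose optima differ.
optima = [np.array([2.0, 0.0]), np.array([0.0, 2.0])]
loss_for = lambda c: (lambda th: 0.5 * np.sum((th - c) ** 2))
grad_for = lambda c: (lambda th: th - c)

init0 = np.array([-2.0, -2.0])      # naive starting point
theta0 = init0.copy()
for _ in range(50):                 # meta-training loop over tasks
    meta_grad = np.zeros_like(theta0)
    for c in optima:
        final, _ = train_path_length(theta0, grad_for(c), loss_for(c))
        # crude first-order meta-gradient: pull the shared init
        # toward each task's endpoint
        meta_grad += theta0 - final
    theta0 -= 0.1 * meta_grad / len(optima)
```

After meta-training, the learned initialization sits between the task optima, so the expected training path from it is much shorter than from the naive start, which is the quantity Leap's objective minimizes.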


Related research

10/15/2019 - Transfer Learning for Algorithm Recommendation
Meta-Learning is a subarea of Machine Learning that aims to take advanta...

06/29/2020 - Robustifying Sequential Neural Processes
When tasks change over time, meta-transfer learning seeks to improve the...

07/05/2022 - A Unified Meta-Learning Framework for Dynamic Transfer Learning
Transfer learning refers to the transfer of knowledge or information fro...

10/15/2021 - Multilingual Speech Recognition using Knowledge Transfer across Learning Processes
Multilingual end-to-end (E2E) models have shown a great potential in the...

11/23/2021 - Sharing to learn and learning to share - Fitting together Meta-Learning, Multi-Task Learning, and Transfer Learning: A meta review
Integrating knowledge across different domains is an essential feature o...

03/29/2022 - Kernel Modulation: A Parameter-Efficient Method for Training Convolutional Neural Networks
Deep Neural Networks, particularly Convolutional Neural Networks (ConvNe...

11/18/2021 - Merging Models with Fisher-Weighted Averaging
Transfer learning provides a way of leveraging knowledge from one task w...
