Palm up: Playing in the Latent Manifold for Unsupervised Pretraining

10/19/2022
by Hao Liu, et al.

Large and diverse datasets have been the cornerstone of many impressive advances in artificial intelligence. Intelligent creatures, however, learn by interacting with their environment, which changes both the incoming sensory signals and the state of the environment itself. In this work, we aim to bring the best of both worlds together and propose an algorithm that exhibits exploratory behavior while utilizing large, diverse datasets. Our key idea is to leverage deep generative models pretrained on static datasets and introduce a dynamics model in their latent space. The transition dynamics simply mixes an action with a randomly sampled latent and applies an exponential moving average for temporal persistence; the resulting latent is decoded to an image using the pretrained generator. We then employ an unsupervised reinforcement learning algorithm to explore this environment and perform unsupervised representation learning on the collected data. We further leverage the temporal structure of this data to pair data points as natural supervision for representation learning. Our experiments suggest that the learned representations transfer successfully to downstream tasks in both vision and reinforcement learning domains.
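To make the described latent dynamics concrete, here is a minimal sketch of one transition step, assuming the latent and action live in the same vector space. The function name `latent_step`, the mixing coefficient `alpha`, the EMA coefficient `beta`, and the generator `G` are illustrative assumptions, not values or interfaces taken from the paper.

```python
import numpy as np

def latent_step(z_prev, action, alpha=0.7, beta=0.9, rng=None):
    """One step of the latent dynamics sketched in the abstract:
    mix the action with a randomly sampled latent, then apply an
    exponential moving average (EMA) for temporal persistence.
    alpha and beta are illustrative coefficients, not the paper's."""
    rng = rng or np.random.default_rng()
    z_noise = rng.standard_normal(z_prev.shape)        # randomly sampled latent
    z_target = alpha * action + (1 - alpha) * z_noise  # mix action with noise
    z_next = beta * z_prev + (1 - beta) * z_target     # EMA keeps temporal persistence
    return z_next

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = rng.standard_normal(128)        # initial latent
    for _ in range(5):
        a = rng.standard_normal(128)    # stand-in for an agent's action
        z = latent_step(z, a, rng=rng)
    # In the paper's setup the resulting latent would be decoded to an
    # image by a pretrained generator, e.g. frame = G(z); G is assumed
    # here and not reproduced.
```

Because the EMA keeps each latent close to its predecessor, consecutive decoded frames form a temporally coherent stream, which is what makes pairing nearby data points a natural supervision signal.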
