Self-Supervised learning for Neural Architecture Search (NAS)

04/03/2023
by   Samuel Ducros, et al.

The objective of this internship is to propose an innovative method that exploits unlabelled data, i.e. data from which the model automatically learns to predict the correct outcome without human annotation. To reach this stage, the work followed an iterative cycle: (1) survey the state of the art and position ourselves against it, (2) propose development ideas, (3) implement these ideas, and (4) test them to position ourselves against the state of the art, then repeat the cycle. During my internship, this cycle was carried out several times, and it structures the research directions explored in this report.
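The core idea of self-supervision mentioned above is that labels are derived from the unlabelled data itself rather than from human annotation. As a minimal illustrative sketch (not the method proposed in the report), the classic rotation-prediction pretext task turns each unlabelled image into four training samples whose label is the rotation applied:

```python
import numpy as np

def make_rotation_task(images):
    """Hypothetical helper: build a self-supervised pretext task.
    Each unlabelled image is rotated by 0/90/180/270 degrees and the
    rotation index becomes the label, so no annotation is required."""
    xs, ys = [], []
    for img in images:
        for k in range(4):             # four possible rotations
            xs.append(np.rot90(img, k))
            ys.append(k)               # pseudo-label derived from the data itself
    return np.stack(xs), np.array(ys)

# toy batch of 2 unlabelled 4x4 "images"
batch = np.arange(32, dtype=float).reshape(2, 4, 4)
x, y = make_rotation_task(batch)
print(x.shape, y[:4])  # (8, 4, 4) [0 1 2 3]
```

A network trained to predict `y` from `x` learns useful representations from unlabelled data alone, which is the kind of signal self-supervised NAS methods use in place of labelled accuracy.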

