Parser-Free Virtual Try-on via Distilling Appearance Flows

03/08/2021
by Yuying Ge, et al.

Image virtual try-on aims to fit a garment image (target clothes) to a person image. Prior methods rely heavily on human parsing; however, even slightly inaccurate segmentation results lead to unrealistic try-on images with large artifacts. A recent pioneering work employed knowledge distillation to reduce the dependency on human parsing: the try-on images produced by a parser-based method are used as supervision to train a "student" network that does not rely on segmentation, making the student mimic the try-on ability of the parser-based model. However, the image quality of the student is bounded by the parser-based model. To address this problem, we propose a novel approach, "teacher-tutor-student" knowledge distillation, which is able to produce highly photo-realistic images without human parsing and possesses several appealing advantages over prior arts. (1) Unlike existing work, our approach treats the fake images produced by the parser-based method as "tutor knowledge", whose artifacts can be corrected by real "teacher knowledge" extracted from the real person images in a self-supervised way. (2) Beyond using real images as supervision, we formulate knowledge distillation in the try-on problem as distilling the appearance flows between the person image and the garment image, enabling us to find accurate dense correspondences between them and produce high-quality results. (3) Extensive evaluations show the clear superiority of our method (see Fig. 1).
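To make the appearance-flow formulation concrete, the sketch below illustrates how a dense 2D appearance flow can warp a garment onto a person and how a flow-distillation loss could be assembled. This is a minimal PyTorch sketch under our own assumptions: the helper `warp`, the function `try_on_losses`, and the tensors `flow_student` and `flow_tutor` are hypothetical names introduced here for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def warp(garment, flow):
    """Warp a garment image with a dense appearance flow.

    garment: (N, C, H, W) garment images
    flow:    (N, 2, H, W) per-pixel (x, y) offsets in normalized [-1, 1] coords
    """
    n, _, h, w = garment.shape
    # Build an identity sampling grid in normalized coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=garment.device),
        torch.linspace(-1, 1, w, device=garment.device),
        indexing="ij",
    )
    base_grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)
    # Offset the grid by the predicted flow and sample the garment at the
    # displaced locations (dense correspondence between garment and person).
    grid = base_grid + flow.permute(0, 2, 3, 1)
    return F.grid_sample(garment, grid, align_corners=True)

def try_on_losses(person, garment, flow_student, flow_tutor):
    """Hypothetical loss sketch: the parser-free student flow is supervised by
    the real person image (teacher knowledge) and by the flow of a parser-based
    model (tutor knowledge), whose artifacts the real image can correct."""
    warped_student = warp(garment, flow_student)
    # Appearance loss against the real person image; in the full method this
    # would be restricted to the region where the garment should appear.
    loss_appearance = F.l1_loss(warped_student, person)
    # Flow-distillation loss against the parser-based tutor's appearance flow.
    loss_distill = F.l1_loss(flow_student, flow_tutor.detach())
    return loss_appearance + loss_distill
```

The sketch only shows the warping and loss terms; network architectures, multi-scale flows, and the generation of the tutor flow by the parser-based model are omitted.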

