P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection

07/14/2020
by   Zhiwei Zhang, et al.

One-class novelty detection aims to identify anomalous instances that do not conform to expected normal data. In this paper, Generative Adversarial Networks (GANs) built on an encoder-decoder-encoder pipeline are used for detection and achieve state-of-the-art performance. However, deep neural networks are too over-parameterized to deploy on resource-limited devices. Therefore, Progressive Knowledge Distillation with GANs (P-KDGAN) is proposed to learn compact and fast novelty detection networks. P-KDGAN is a novel attempt to connect two standard GANs through a designed distillation loss that transfers knowledge from the teacher to the student. Progressive knowledge distillation is a two-step approach that continuously improves the performance of the student GAN and outperforms single-step methods. In the first step, the student GAN learns basic knowledge entirely from the teacher, guided by the pretrained teacher GAN with fixed weights. In the second step, joint fine-tuning of the knowledgeable teacher and student GANs further improves performance and stability. Experimental results on CIFAR-10, MNIST, and FMNIST show that our method improves the performance of the student GAN by 2.44% and 1.73% when compressing the computation at a ratio of 700:1.
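The two-step scheme in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration, not the paper's implementation: the tiny MLPs stand in for the encoder-decoder-encoder GANs, and all sizes, learning rates, step counts, and names (`make_net`, `distill_loss`) are assumptions for demonstration.

```python
# Minimal sketch of progressive distillation: step 1 trains the student
# against a frozen teacher; step 2 fine-tunes both networks jointly.
# Architectures and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_net(width):
    # Stand-in generator: encode 8 features to `width` units, decode back.
    return nn.Sequential(nn.Linear(8, width), nn.ReLU(), nn.Linear(width, 8))

teacher = make_net(64)  # large teacher (pretrained in the paper; random here)
student = make_net(4)   # compact student to be distilled

def distill_loss(x):
    # Distillation loss: match student outputs to teacher outputs (MSE).
    return ((student(x) - teacher(x)) ** 2).mean()

x = torch.randn(32, 8)
initial = distill_loss(x).item()

# Step 1: teacher weights are fixed; only the student is updated.
for p in teacher.parameters():
    p.requires_grad_(False)
opt = torch.optim.Adam(student.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = distill_loss(x)
    loss.backward()
    opt.step()

# Step 2: joint fine-tuning of the teacher and student together.
for p in teacher.parameters():
    p.requires_grad_(True)
opt = torch.optim.Adam(
    list(teacher.parameters()) + list(student.parameters()), lr=1e-3
)
for _ in range(50):
    opt.zero_grad()
    loss = distill_loss(x)
    loss.backward()
    opt.step()

final = distill_loss(x).item()
```

In the paper the distillation loss also acts on intermediate feature maps and is combined with the GANs' adversarial and reconstruction objectives; the sketch keeps only the output-matching term to show the two-phase freeze-then-joint-train structure.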


Related research

- New Perspective on Progressive GANs Distillation for One-class Novelty Detection (09/15/2021): One-class novelty detection is conducted to identify anomalous instances...
- Compressing GANs using Knowledge Distillation (02/01/2019): Generative Adversarial Networks (GANs) have been used in several machine...
- PPCD-GAN: Progressive Pruning and Class-Aware Distillation for Large-Scale Conditional GANs Compression (03/16/2022): We push forward neural network compression research by exploiting a nove...
- Unsupervised Deep Digital Staining For Microscopic Cell Images Via Knowledge Distillation (03/03/2023): Staining is critical to cell imaging and medical diagnosis, which is exp...
- Lifelong Twin Generative Adversarial Networks (07/09/2021): In this paper, we propose a new continuously learning generative model, ...
- Mind the Gap in Distilling StyleGANs (08/18/2022): StyleGAN family is one of the most popular Generative Adversarial Networ...
- T-GD: Transferable GAN-generated Images Detection Framework (08/10/2020): Recent advancements in Generative Adversarial Networks (GANs) enable the...
