Assessing a Single Image in Reference-Guided Image Synthesis

12/08/2021
by Jiayi Guo, et al.

Assessing the performance of Generative Adversarial Networks (GANs) is an important problem of practical significance. Although several evaluation metrics have been proposed, they generally assess the quality of the generated image distribution as a whole. They are therefore not applicable to Reference-guided Image Synthesis (RIS) tasks, i.e., rendering a source image in the style of another reference image, where the quality of each individual generated image is crucial. In this paper, we propose a general learning-based framework, Reference-guided Image Synthesis Assessment (RISA), to quantitatively evaluate the quality of a single generated image. Notably, training RISA does not require human annotations. Specifically, the training data for RISA are produced by the intermediate models saved during the training of an RIS model and are weakly annotated with the number of training iterations, exploiting the positive correlation between image quality and iteration count. Since this annotation is too coarse to serve as a supervision signal, we introduce two techniques: 1) a pixel-wise interpolation scheme to refine the coarse labels, and 2) multiple binary classifiers in place of a naïve regressor. In addition, an unsupervised contrastive loss is introduced to effectively capture the style similarity between a generated image and its reference image. Empirical results on various datasets demonstrate that RISA is highly consistent with human preference and transfers well across models.
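To make two of the abstract's ideas concrete, the sketch below illustrates (a) replacing a single quality regressor with multiple binary classifiers that are trained on ordinal targets derived from coarse iteration-based labels, and (b) an InfoNCE-style contrastive loss pairing each generated image with its own reference. This is a minimal, hypothetical PyTorch sketch, not the authors' released code; names such as RISAHead, ordinal_targets, style_contrastive_loss, and the choice of 10 quality levels are illustrative assumptions.

```python
# Hypothetical sketch (assumed implementation, not the authors' code):
# ordinal quality head with K binary classifiers + contrastive style loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RISAHead(nn.Module):
    """Predicts a quality score via K binary classifiers (ordinal decomposition).

    Classifier k answers "is the image above quality level k?"; the final score
    is the sum of the K probabilities, so it lies in [0, K].
    """
    def __init__(self, feat_dim: int, num_levels: int = 10):
        super().__init__()
        self.classifiers = nn.Linear(feat_dim, num_levels)  # one logit per level

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        probs = torch.sigmoid(self.classifiers(feats))       # (B, K)
        return probs.sum(dim=1)                               # scalar score per image

def ordinal_targets(levels: torch.Tensor, num_levels: int) -> torch.Tensor:
    """Turn a coarse integer label (e.g., derived from training iterations)
    into K binary targets: 1 for every level below the label, else 0."""
    thresholds = torch.arange(num_levels, device=levels.device)
    return (levels.unsqueeze(1) > thresholds).float()         # (B, K)

def style_contrastive_loss(gen_feats, ref_feats, temperature: float = 0.1):
    """InfoNCE-style loss: a generated image should be closest in style space
    to its own reference; other references in the batch act as negatives."""
    gen = F.normalize(gen_feats, dim=1)
    ref = F.normalize(ref_feats, dim=1)
    logits = gen @ ref.t() / temperature                      # (B, B) similarities
    labels = torch.arange(gen.size(0), device=gen.device)     # positives on diagonal
    return F.cross_entropy(logits, labels)

# Usage sketch: features could come from any backbone; labels are coarse levels.
if __name__ == "__main__":
    feats = torch.randn(8, 512)
    head = RISAHead(feat_dim=512, num_levels=10)
    score = head(feats)                                        # (8,)
    targets = ordinal_targets(torch.randint(0, 10, (8,)), 10)
    cls_loss = F.binary_cross_entropy(torch.sigmoid(head.classifiers(feats)), targets)
    style_loss = style_contrastive_loss(torch.randn(8, 128), torch.randn(8, 128))
```

The ordinal decomposition is one standard way to exploit coarse, ordered labels: each classifier only needs to decide whether the image clears a given quality threshold, which is a weaker and more learnable signal than regressing an exact score.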


