Style Transfer of Black and White Silhouette Images using CycleGAN and a Randomly Generated Dataset

by Worasait Suwannik, et al.

CycleGAN can be used to transfer an artistic style to an image, and it does not require pairs of source and stylized images to train a model. Leveraging this property, we propose using randomly generated data to train a machine learning model that transfers a traditional art style to a black-and-white silhouette image. The results are noticeably better than those of previous neural style transfer methods; however, some areas for improvement remain, such as removing artifacts and spikes from the transformed images.
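As a brief illustration of why unpaired training is possible, the following is a minimal NumPy sketch of the cycle-consistency loss at the heart of CycleGAN. The toy lambda generators are hypothetical stand-ins for the actual convolutional networks; the point is only that the loss compares each image to its own round-trip reconstruction, never to a paired ground-truth stylized image.

```python
import numpy as np

def cycle_consistency_loss(G, F, x_batch, y_batch):
    """L1 cycle loss: mean |F(G(x)) - x| + mean |G(F(y)) - y|.

    G maps domain X (e.g. silhouettes) to domain Y (e.g. traditional art);
    F maps Y back to X. No paired (x, y) examples are needed.
    """
    forward = np.abs(F(G(x_batch)) - x_batch).mean()
    backward = np.abs(G(F(y_batch)) - y_batch).mean()
    return forward + backward

# Toy generators that are (near-)exact inverses, so the cycle loss is ~0.
G = lambda x: 2.0 * x + 1.0
F = lambda y: (y - 1.0) / 2.0

x = np.random.rand(4, 8)  # batch from the "silhouette" domain
y = np.random.rand(4, 8)  # batch from the "traditional art" domain
print(cycle_consistency_loss(G, F, x, y))  # close to 0 for inverse maps
```

In the full model this term is added to the usual adversarial losses of the two discriminators; the cycle term is what keeps G from mapping every silhouette to an arbitrary stylized image.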



