Arbitrary Style Transfer with Style-Attentional Networks

12/06/2018
by   Dae Young Park, et al.

Arbitrary style transfer is the problem of synthesizing a content image with the style of an image that has never been seen before. Recent arbitrary style transfer algorithms face a trade-off between preserving the content structure and rendering the style patterns, or find it difficult to maintain global and local style patterns simultaneously because of their patch-based mechanism. In this paper, we introduce a novel style-attentional network (SANet), which efficiently and flexibly decorates the local style patterns according to the semantic spatial distribution of the content image. A new identity loss function and multi-level feature embedding also enable our SANet and decoder to preserve the content structure as much as possible while enriching the style patterns. Experimental results demonstrate that our algorithm synthesizes higher-quality stylized images in real time than state-of-the-art algorithms.
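The core idea described above can be illustrated with a minimal sketch: each spatial position of the content feature map attends over all positions of the style feature map, so local style patterns are placed according to the content's semantic layout. This is a simplified NumPy illustration of such a style-attention step, not the paper's implementation; the normalization choice (channel-wise mean-variance normalization of queries and keys) and the flattened `(positions, channels)` feature layout are assumptions for the sake of the example.

```python
import numpy as np

def mean_var_norm(x, eps=1e-5):
    # Mean-variance normalization over spatial positions,
    # x has shape (n_positions, n_channels).
    mu = x.mean(axis=0, keepdims=True)
    sigma = x.std(axis=0, keepdims=True)
    return (x - mu) / (sigma + eps)

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def style_attention(content_feat, style_feat):
    """Sketch of a style-attention step.

    content_feat: (Nc, C) flattened content feature map (queries)
    style_feat:   (Ns, C) flattened style feature map (keys/values)
    Returns a (Nc, C) map: style features re-arranged onto the
    content's spatial layout via an attention map.
    """
    q = mean_var_norm(content_feat)       # normalized content queries
    k = mean_var_norm(style_feat)         # normalized style keys
    attn = softmax(q @ k.T, axis=1)       # (Nc, Ns) attention weights
    return attn @ style_feat              # attention-weighted style features
```

Each row of the attention map is a distribution over style positions, so every content location receives a convex combination of style features; a full network would additionally learn projection layers for the queries, keys, and values before this step.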
