DeepMovie: Using Optical Flow and Deep Neural Networks to Stylize Movies

05/26/2016
by Alexander G. Anderson, et al.

A recent paper by Gatys et al. describes a method for rendering an image in the style of another image. First, they use convolutional neural network features to build a statistical model of an image's style. Then they synthesize a new image that has the content of one image but the style statistics of another. Here, we extend this method to render a movie in a given artistic style. The naive solution of rendering each frame independently produces poor results because the style features shift substantially from one frame to the next. The alternative naive method, which initializes the optimization for each frame with the rendered version of the previous frame, also produces poor results because the texture features stay fixed relative to the frame of the movie instead of moving with the objects in the scene. The main contribution of this paper is to use optical flow to initialize the style transfer optimization so that the texture features move with the objects in the video. Finally, we suggest a method to incorporate optical flow explicitly into the cost function.
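
The following sketch only illustrates that initialization idea and is not the authors' exact pipeline: it uses OpenCV's Farneback dense optical flow (an assumption; the paper does not commit to this particular flow estimator) to warp the previous stylized frame into the geometry of the next content frame. The warped image would then serve as the starting point of the Gatys-style optimization for that frame. The function name warp_previous_stylized is hypothetical.

```python
import cv2
import numpy as np

def warp_previous_stylized(prev_frame, next_frame, prev_stylized):
    """Warp the previous stylized frame toward the next content frame,
    so it can initialize the style-transfer optimization for that frame."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)

    # Dense flow from the next frame back to the previous frame, so each
    # pixel of the next frame knows where to sample in prev_stylized.
    flow = cv2.calcOpticalFlowFarneback(next_gray, prev_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = next_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)

    # Backward-warp the previous stylized frame into the next frame's geometry.
    return cv2.remap(prev_stylized, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_REPLICATE)
```

The style transfer optimization for the next frame would then start from this warped image rather than from the raw previous result or from noise, so the texture features follow the motion of objects in the scene.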

Related research

- Evolvement Constrained Adversarial Learning for Video Style Transfer (11/06/2018): Video style transfer is a useful component for applications such as augm...
- Two-Stream Convolutional Networks for Dynamic Texture Synthesis (06/21/2017): We introduce a two-stream model for dynamic texture synthesis. Our model...
- Learning to Transfer Visual Effects from Videos to Images (12/03/2020): We study the problem of animating images by transferring spatio-temporal...
- Raiders of the Lost Art (09/10/2019): Neural style transfer, first proposed by Gatys et al. (2015), can be use...
- Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer (07/10/2020): Video style transfer techniques inspire many exciting applications on mo...
- MVStylizer: An Efficient Edge-Assisted Video Photorealistic Style Transfer System for Mobile Phones (05/24/2020): Recent research has made great progress in realizing neural style transf...
- Video Interpolation using Optical Flow and Laplacian Smoothness (03/26/2016): Non-rigid video interpolation is a common computer vision task. In this ...
