Deep Video Color Propagation

08/09/2018
by Simone Meyer, et al.

Traditional approaches to color propagation in videos rely on some form of matching between consecutive video frames. Colors are then propagated both spatially and temporally using appearance descriptors. These methods, however, are computationally expensive and do not take advantage of the scene's semantic information. In this work we propose a deep learning framework for color propagation that combines a local strategy, which propagates colors frame by frame to ensure temporal stability, with a global strategy, which uses semantics to propagate colors over a longer range. Our evaluation shows the superiority of our approach over existing video and image color propagation methods as well as neural photo-realistic style transfer approaches.
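The abstract's combination of a local, frame-by-frame step and a global, semantics-driven step can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the flow-based warping, the intensity-based matching (a crude stand-in for matching deep semantic features), and the blending weight `alpha` are all simplifying assumptions made for illustration.

```python
import numpy as np

def local_propagation(prev_color, flow):
    """Local strategy: warp the previous frame's colors along a
    per-pixel flow field (backward warping, nearest-neighbour lookup)."""
    h, w, _ = prev_color.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys - flow[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - flow[..., 0]).astype(int), 0, w - 1)
    return prev_color[src_y, src_x]

def global_propagation(gray, ref_gray, ref_color):
    """Global strategy: transfer colors from a distant reference frame by
    matching per-pixel intensities; a real system would match deep
    semantic features instead of raw gray values."""
    idx = np.abs(gray.ravel()[:, None] - ref_gray.ravel()[None, :]).argmin(axis=1)
    return ref_color.reshape(-1, 3)[idx].reshape(gray.shape + (3,))

def propagate(gray, prev_color, flow, ref_gray, ref_color, alpha=0.7):
    """Blend the temporally stable local estimate with the
    longer-range global estimate (alpha is an assumed fixed weight;
    a learned network would predict it per pixel)."""
    local = local_propagation(prev_color, flow)
    glob = global_propagation(gray, ref_gray, ref_color)
    return alpha * local + (1 - alpha) * glob
```

With zero flow and `alpha=1.0` the output reduces to the previous frame's colors, which makes the temporal-stability role of the local branch easy to verify in isolation.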


