Paper Title
Reference-Based Video Colorization with Spatiotemporal Correspondence
Paper Authors
Abstract
We propose a novel reference-based video colorization framework with spatiotemporal correspondence. Reference-based methods colorize grayscale frames by referencing a user-provided color frame. Existing methods suffer from color leakage between objects and the emergence of averaged colors, both stemming from non-local semantic correspondence in space. To address this issue, we warp colors only from regions on the reference frame that are restricted by correspondence in time. We propagate masks as temporal correspondences using two complementary tracking approaches: off-the-shelf instance tracking for high-performance segmentation, and a newly proposed dense tracking to cover a wider variety of objects. By restricting the temporally related regions from which colors are referenced, our approach propagates faithful colors throughout the video. Experiments demonstrate that our method outperforms state-of-the-art methods both quantitatively and qualitatively.
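The core idea of mask-restricted color warping can be illustrated with a minimal sketch. The function below is not the paper's method: it is a toy stand-in that matches raw grayscale intensities instead of deep features, and the function name, arguments, and matching rule are all illustrative assumptions. It shows only the restriction mechanism: a target pixel inside a propagated mask may copy color solely from reference pixels inside the corresponding reference-frame mask, which is what prevents color leakage from unrelated objects.

```python
import numpy as np

def warp_colors_restricted(target_gray, ref_gray, ref_color, target_mask, ref_mask):
    """Toy mask-restricted color warping (illustrative, not the paper's method).

    For each target pixel inside its propagated mask, copy the color of the
    reference pixel -- drawn only from the reference mask -- whose grayscale
    intensity is most similar. Real methods match learned features, not
    raw intensities.
    """
    h, w = target_gray.shape
    out = np.zeros((h, w, 3), dtype=ref_color.dtype)
    ref_idx = np.argwhere(ref_mask)    # candidate reference pixels (restricted set)
    ref_vals = ref_gray[ref_mask]      # their grayscale intensities
    for y, x in np.argwhere(target_mask):
        # Nearest-intensity match within the restricted reference region only.
        k = np.argmin(np.abs(ref_vals - target_gray[y, x]))
        ry, rx = ref_idx[k]
        out[y, x] = ref_color[ry, rx]
    return out
```

Because the candidate set `ref_idx` is built from `ref_mask` alone, colors outside the temporally corresponding region can never be copied, regardless of how semantically similar they are: that is the spatial restriction the abstract describes, here in its simplest possible form.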