Paper Title

All-to-key Attention for Arbitrary Style Transfer

Authors

Mingrui Zhu, Xiao He, Nannan Wang, Xiaoyu Wang, Xinbo Gao

Abstract

Attention-based arbitrary style transfer methods have shown promising performance in synthesizing vivid local style details. They typically use an all-to-all attention mechanism: each position of the content features is fully matched to all positions of the style features. However, all-to-all attention tends to generate distorted style patterns and has quadratic complexity, limiting the effectiveness and efficiency of arbitrary style transfer. In this paper, we propose a novel all-to-key attention mechanism, in which each position of the content features is matched only to stable key positions of the style features, which is more in line with the characteristics of style transfer. Specifically, it integrates two newly proposed forms of attention: distributed attention and progressive attention. Distributed attention assigns attention to key style representations that depict the style distribution of local regions; progressive attention shifts attention from coarse-grained regions to fine-grained key positions. The resulting module, dubbed StyA2K, shows extraordinary performance in preserving semantic structure and rendering consistent style patterns. Qualitative and quantitative comparisons with state-of-the-art methods demonstrate the superior performance of our approach.
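
The coarse-to-fine matching described in the abstract can be made concrete with a small sketch. The code below is an illustrative approximation, not the authors' StyA2K implementation: the region grid size, the average-pooled region keys standing in for the "distributed" key representations, and the hard top-1 region selection in the "progressive" coarse-to-fine stage are all assumptions made for clarity.

```python
# A minimal, illustrative sketch of the all-to-key attention idea, NOT the
# authors' StyA2K implementation. The region grid size, average-pooled region
# keys, and hard top-1 region selection are assumptions made for clarity.
import torch
import torch.nn.functional as F

def all_to_key_attention(content, style, region=4):
    """content, style: (B, C, H, W) feature maps; H, W divisible by `region`."""
    B, C, H, W = content.shape
    q = content.flatten(2).transpose(1, 2)                    # (B, HW, C) queries

    # Coarse stage ("progressive" attention): summarize each style region by
    # average pooling, so queries first attend over a small grid of region keys.
    coarse = F.adaptive_avg_pool2d(style, region)             # (B, C, r, r)
    k_coarse = coarse.flatten(2)                              # (B, C, r*r)
    attn_coarse = torch.softmax(q @ k_coarse / C ** 0.5, -1)  # (B, HW, r*r)

    # Each query picks its best-matching coarse region. Hard selection is an
    # assumption; a soft weighting would also fit the abstract's description.
    region_idx = attn_coarse.argmax(-1)                       # (B, HW)

    # Fine stage: attend only to positions inside the selected region rather
    # than to all H*W style positions, avoiding the quadratic all-to-all cost.
    rh, rw = H // region, W // region
    patches = style.unfold(2, rh, rh).unfold(3, rw, rw)       # (B, C, r, r, rh, rw)
    patches = patches.reshape(B, C, region * region, rh * rw)
    out = torch.empty_like(q)
    for b in range(B):
        sel = patches[b, :, region_idx[b]]                    # (C, HW, rh*rw)
        k_fine = sel.permute(1, 2, 0)                         # (HW, rh*rw, C)
        a = torch.softmax(q[b].unsqueeze(1) @ k_fine.transpose(1, 2) / C ** 0.5, -1)
        out[b] = (a @ k_fine).squeeze(1)                      # (HW, C)
    return out.transpose(1, 2).reshape(B, C, H, W)

# Usage (in a real pipeline these would be VGG features fed to a decoder):
# stylized = all_to_key_attention(content_feat, style_feat)
```

Note how the fine stage scores each query against only H*W / r^2 style positions inside its selected region instead of all H*W positions, which is the source of the complexity reduction over all-to-all attention claimed in the abstract.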
