Abstract
Style transfer is the problem of rendering a content image in the style of another, reference style image. A natural and common task in style-transfer applications is adjusting the strength of stylization. The algorithm of Gatys et al. (2016) provides this ability by changing the weighting factors of the content and style losses, but it is computationally inefficient. Real-time style transfer, introduced by Johnson et al. (2016), enables fast stylization of any image by passing it through a pre-trained transformer network. Although fast, this architecture cannot continuously adjust style strength. We propose an extension to real-time style transfer that allows direct control of stylization strength at inference time while still requiring only a single transformer network. Qualitative and quantitative experiments demonstrate that the proposed method enables smooth control of stylization strength and removes certain stylization artifacts that appear in the original real-time style transfer method. Comparisons with alternative real-time style-transfer algorithms that can adjust stylization strength show that our method reproduces style in more detail.
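The abstract contrasts two ways of controlling stylization strength: re-weighting the content and style losses in the slow optimization of Gatys et al. (2016), versus some form of inference-time control. The following minimal sketch illustrates both ideas in the abstract sense only; it is not the paper's actual method, and all function names and weights are hypothetical. The `blend` function shows the simplest conceivable inference-time control (pixel interpolation between the content image and a fully stylized output), which a conditioned transformer network would improve upon.

```python
def gatys_objective(content_loss: float, style_loss: float,
                    content_weight: float, style_weight: float) -> float:
    # Weighted objective in the style of Gatys et al. (2016):
    # raising style_weight relative to content_weight strengthens
    # stylization, but each new ratio requires re-running the
    # (slow) image optimization from scratch.
    return content_weight * content_loss + style_weight * style_loss


def blend(content_px, stylized_px, strength: float):
    # Naive inference-time strength control (illustrative only):
    # pixel-wise linear interpolation between the content image and a
    # fully stylized output, with strength in [0, 1].
    return [(1.0 - strength) * c + strength * s
            for c, s in zip(content_px, stylized_px)]
```

For example, `blend(img_a, img_b, 0.0)` returns the content pixels unchanged and `blend(img_a, img_b, 1.0)` returns the fully stylized pixels, with a continuous sweep in between; the proposed method instead achieves such continuous control inside a single transformer network rather than by post-hoc pixel mixing.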
URL
https://arxiv.org/abs/1904.08643