Abstract
Cycle consistency extends generative adversarial networks from aligned image pairs to unpaired training sets and underpins many image-to-image translation methods. However, errors accumulated during image reconstruction can degrade the realism and quality of the generated images. To address this, we propose a novel long- and short-cycle-consistent loss that is simple and easy to implement. Our dual-cycle-constrained cross-domain image-to-image translation method suppresses error accumulation and strengthens adversarial learning. When image information is transferred from one domain to another, the cycle-consistency-based reconstruction constraint is enforced over both short and long cycles to eliminate accumulated error. We adopt a cascaded dual-cycle design in which the image reconstructed by the first cycle serves as the input to the next cycle. Extensive experiments on several datasets show a distinct improvement over baseline approaches in most translation scenarios, and the proposed method outperforms the compared approaches.
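To make the cascaded dual-cycle constraint concrete, the following is a minimal sketch, not the authors' implementation: toy invertible maps `G` (X to Y) and `F` (Y to X) stand in for the trained generators, and `dual_cycle_loss`, `lam_short`, and `lam_long` are illustrative names assumed here. The short cycle penalizes the first reconstruction error, while the long cycle feeds that reconstruction back through both generators, as the abstract describes.

```python
import numpy as np

# Toy stand-ins for the generators (illustrative, not the paper's models):
# G maps domain X -> Y, F maps Y -> X, and here F is G's exact inverse.
def G(x):
    return 2.0 * x + 1.0

def F(y):
    return 0.5 * (y - 1.0)

def l1(a, b):
    """Mean absolute (L1) reconstruction error."""
    return float(np.mean(np.abs(a - b)))

def dual_cycle_loss(x, lam_short=1.0, lam_long=1.0):
    # Short cycle: x -> G(x) -> F(G(x)) should reconstruct x.
    x_rec = F(G(x))
    short_term = l1(x_rec, x)
    # Long cycle: the first reconstruction is cascaded as the input
    # to a second cycle, constraining accumulated error against x.
    x_rec2 = F(G(x_rec))
    long_term = l1(x_rec2, x)
    return lam_short * short_term + lam_long * long_term

x = np.array([0.3, -1.2, 2.5])
print(dual_cycle_loss(x))  # exact inverses, so the loss is ~0 up to float rounding
```

With imperfect generators, the long-cycle term grows faster than the short-cycle term as reconstruction errors compound, which is what motivates constraining both cycles jointly.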
Affiliation: Tongji University