TY - JOUR
T1 - Bridging Data Gaps in Diffusion Models with Adversarial Noise-Based Transfer Learning
AU - Wang, Xiyu
AU - Lin, Baijiong
AU - Liu, Daochang
AU - Chen, Ying Cong
AU - Xu, Chang
N1 - Publisher Copyright:
Copyright 2024 by the author(s)
PY - 2024
Y1 - 2024
N2 - Diffusion Probabilistic Models (DPMs) show significant potential in image generation, yet their performance hinges on having access to large datasets. Previous works, like Generative Adversarial Networks (GANs), have tackled the limited data problem by transferring pretrained models learned with sufficient data. However, those methods are hard to apply to DPMs because of distinct differences between DPM-based and GAN-based methods, namely the iterative denoising process integral to DPMs and their need for many time steps with no target noise. In this paper, we propose a novel DPM-based transfer learning method, called DPMs-ANT, to address the limited data problem. It includes two strategies: similarity-guided training, which boosts transfer with a classifier, and adversarial noise selection, which adaptively chooses targeted noise based on the input image. Extensive experiments on few-shot image generation tasks demonstrate that our method is efficient and excels in terms of image quality and diversity compared to existing GAN-based and DPM-based methods.
AB - Diffusion Probabilistic Models (DPMs) show significant potential in image generation, yet their performance hinges on having access to large datasets. Previous works, like Generative Adversarial Networks (GANs), have tackled the limited data problem by transferring pretrained models learned with sufficient data. However, those methods are hard to apply to DPMs because of distinct differences between DPM-based and GAN-based methods, namely the iterative denoising process integral to DPMs and their need for many time steps with no target noise. In this paper, we propose a novel DPM-based transfer learning method, called DPMs-ANT, to address the limited data problem. It includes two strategies: similarity-guided training, which boosts transfer with a classifier, and adversarial noise selection, which adaptively chooses targeted noise based on the input image. Extensive experiments on few-shot image generation tasks demonstrate that our method is efficient and excels in terms of image quality and diversity compared to existing GAN-based and DPM-based methods.
UR - http://www.scopus.com/inward/record.url?scp=85203788044&partnerID=8YFLogxK
UR - https://proceedings.mlr.press/v235/
M3 - Conference article
AN - SCOPUS:85203788044
SN - 2640-3498
VL - 235
SP - 50944
EP - 50959
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
M1 - 4982
T2 - 41st International Conference on Machine Learning
Y2 - 21 July 2024 through 27 July 2024
ER -