Bridging Data Gaps in Diffusion Models with Adversarial Noise-Based Transfer Learning

Xiyu Wang, Baijiong Lin, Daochang Liu, Ying Cong Chen, Chang Xu

Research output: Contribution to journal › Conference article › peer-review

Abstract

Diffusion Probabilistic Models (DPMs) show significant potential in image generation, yet their performance hinges on access to large datasets. Previous works on Generative Adversarial Networks (GANs) have tackled the limited-data problem by transferring models pretrained on sufficient data. However, those methods are hard to apply to DPMs because of the distinct differences between DPM-based and GAN-based methods: DPMs rely on a unique iterative denoising process that spans many time steps, with no fixed target noise to transfer against. In this paper, we propose a novel DPM-based transfer learning method, called DPMs-ANT, to address the limited-data problem. It includes two strategies: similarity-guided training, which boosts transfer with a classifier, and adversarial noise selection, which adaptively chooses targeted noise based on the input image. Extensive experiments on few-shot image generation tasks demonstrate that our method is efficient and excels in image quality and diversity compared to existing GAN-based and DPM-based methods.
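The adversarial noise selection described above can be illustrated with a minimal sketch: among several candidate Gaussian noises, pick the one the current denoiser fits worst for a given input, so training focuses on the hardest target. This is a hypothetical toy implementation, not the paper's actual algorithm; the `forward_diffuse`, `denoise_loss`, and `adversarial_noise_selection` names and the list-based "images" are illustrative assumptions.

```python
import math
import random


def forward_diffuse(x0, eps, alpha_bar):
    """Forward diffusion q(x_t | x_0): sqrt(a)*x0 + sqrt(1-a)*eps."""
    a, b = math.sqrt(alpha_bar), math.sqrt(1.0 - alpha_bar)
    return [a * xi + b * ei for xi, ei in zip(x0, eps)]


def denoise_loss(model, x0, eps, alpha_bar):
    """MSE between the model's predicted noise and the true noise eps."""
    xt = forward_diffuse(x0, eps, alpha_bar)
    pred = model(xt, alpha_bar)
    return sum((p - e) ** 2 for p, e in zip(pred, eps)) / len(eps)


def adversarial_noise_selection(model, x0, alpha_bar,
                                num_candidates=8, rng=random):
    """Adaptively choose, per input, the candidate noise with the
    highest denoising loss (the 'adversarial' target noise)."""
    best_eps, best_loss = None, -math.inf
    for _ in range(num_candidates):
        eps = [rng.gauss(0.0, 1.0) for _ in x0]  # candidate noise
        loss = denoise_loss(model, x0, eps, alpha_bar)
        if loss > best_loss:
            best_eps, best_loss = eps, loss
    return best_eps, best_loss
```

The selected noise would then replace the usual uniformly sampled target in the DPM training loss; in practice the paper operates on image tensors and a neural denoiser rather than these toy lists.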

Original language: English
Article number: 4982
Pages (from-to): 50944-50959
Number of pages: 16
Journal: Proceedings of Machine Learning Research
Volume: 235
Publication status: Published - 2024
Externally published: Yes
Event: 41st International Conference on Machine Learning - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024
