TY - JOUR
T1 - A generalized optimization-based generative adversarial network
AU - Farhadinia, Bahram
AU - Ahangari, Mohammad Reza
AU - Heydari, Aghileh
AU - Datta, Amitava
PY - 2024/8/15
Y1 - 2024/8/15
N2 - Interest in Generative Adversarial Networks (GANs) continues to grow, with diverse GAN variations emerging for applications across various domains. However, substantial challenges persist in advancing GANs. Effective training of deep learning models, including GANs, heavily relies on well-defined loss functions. Specifically, establishing a logical and reciprocal connection between the training image and the generator is crucial. In this context, we introduce a novel GAN loss function that employs the Sugeno complement concept to logically link the training image and the generator. Our proposed loss function is a composition of logical elements, and we demonstrate analytically that it outperforms an existing loss function from the literature. This superiority is further substantiated via comprehensive experiments, showcasing the loss function’s ability to facilitate smooth convergence during training and effectively address mode collapse issues in GANs.
AB - Interest in Generative Adversarial Networks (GANs) continues to grow, with diverse GAN variations emerging for applications across various domains. However, substantial challenges persist in advancing GANs. Effective training of deep learning models, including GANs, heavily relies on well-defined loss functions. Specifically, establishing a logical and reciprocal connection between the training image and the generator is crucial. In this context, we introduce a novel GAN loss function that employs the Sugeno complement concept to logically link the training image and the generator. Our proposed loss function is a composition of logical elements, and we demonstrate analytically that it outperforms an existing loss function from the literature. This superiority is further substantiated via comprehensive experiments, showcasing the loss function’s ability to facilitate smooth convergence during training and effectively address mode collapse issues in GANs.
U2 - 10.1016/j.eswa.2024.123413
DO - 10.1016/j.eswa.2024.123413
M3 - Article
SN - 0957-4174
VL - 248
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 123413
ER -