Qualifying Exam

Learning Robust and Unbiased Conditional Generative Adversarial Networks



Thursday, May 07, 2020, 12:30pm - 01:30pm


Speaker: Ligong Han

Location: Remote via Webex


Committee:

Prof. Dimitris Metaxas

Prof. Sungjin Ahn

Prof. Yongfeng Zhang

Prof. Zheng Zhang

Event Type: Qualifying Exam

Abstract: Conditional Generative Adversarial Networks (CGANs) are widely used generative models and have shown exceptional generation performance over the past few years. Class-conditional information can be incorporated either by (1) conditioning the discriminator directly on labels, or by (2) adding an auxiliary classification loss. We identify two limitations of current conditional GAN models: (1) they require large numbers of annotations, and (2) the widely adopted Auxiliary Classifier GAN (ACGAN) learns a biased distribution.

To address the first problem, we propose a novel generative adversarial network that exploits weak supervision in the form of pairwise comparisons (PC-GAN) for image attribute editing. Drawing on Bayesian uncertainty estimation and noise-tolerant adversarial training, PC-GAN estimates attribute ratings efficiently and is robust to noisy supervision. Through extensive experiments, we show both qualitatively and quantitatively that PC-GAN performs comparably to fully supervised methods and outperforms unsupervised baselines.

Previous attempts have been made to remedy the second problem, such as the Twin Auxiliary Classifier GAN (TAC-GAN), which introduces a twin classifier into the min-max game. However, the twin auxiliary classifier has been reported to cause instability in training. To this end, we propose an unbiased Auxiliary GAN (AC-GAN-MINE) that utilizes the Mutual Information Neural Estimator (MINE) to estimate the mutual information between the generated data distribution and the labels. Experimental results on three datasets, Mixture of Gaussians (MoG), MNIST, and CIFAR10, show that AC-GAN-MINE performs better than AC-GAN and TAC-GAN.
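As background for the MINE component mentioned above: MINE estimates mutual information via the Donsker-Varadhan lower bound, I(X;Y) >= E_P[T(x,y)] - log E_{Px*Py}[exp(T(x,y))], where T is normally a learned statistics network. The toy sketch below (not the talk's implementation; the fixed scoring function `T` stands in for the learned network) evaluates this bound on perfectly correlated binary data, whose true mutual information is log 2:

```python
import math
import random

random.seed(0)

# Toy joint distribution: label y in {0, 1}, and x simply copies y,
# so the true mutual information I(X;Y) = log 2 ~= 0.693 nats.
n = 10000
ys = [random.randrange(2) for _ in range(n)]
xs = list(ys)  # x carries full label information

def T(x, y, c=5.0):
    """Hand-picked stand-in for MINE's learned statistics network."""
    return c if x == y else -c

# Donsker-Varadhan bound: I(X;Y) >= E_P[T] - log E_{Px*Py}[exp(T)].
joint_term = sum(T(x, y) for x, y in zip(xs, ys)) / n

ys_shuffled = list(ys)
random.shuffle(ys_shuffled)  # break the pairing to sample the product of marginals
marg_term = math.log(sum(math.exp(T(x, y)) for x, y in zip(xs, ys_shuffled)) / n)

dv_bound = joint_term - marg_term
print(f"DV lower bound: {dv_bound:.3f} nats (true MI = log 2 ~= 0.693)")
```

With this choice of T, the bound comes out close to log 2; in MINE, T is trained by gradient ascent on the same objective so that the bound tightens automatically, which is what lets AC-GAN-MINE penalize dependence between generated samples and labels without a biased auxiliary classifier.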