Performance-comparison-of-GAN-on-cifar-10

Performance comparison of ACGAN, BEGAN, CGAN, DRAGAN, EBGAN, GAN, infoGAN, LSGAN, VAE, WGAN, WGAN_GP on cifar-10

Reference: https://github.com/hwalsuklee/tensorflow-generative-model-collections
The original code targets MNIST; we changed the network structures to work on CIFAR-10 and evaluate the results with the Inception Score. The network structures are kept almost the same.
The following results can be reproduced with:

python main.py --dataset cifar-10 --gan_type <GAN_TYPE> --epoch 60 --batch_size 64

(replace <GAN_TYPE> with one of the model names listed above).
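The Inception Score used for evaluation is exp(E_x[KL(p(y|x) || p(y))]). A minimal NumPy sketch of that computation, assuming you already have the softmax class probabilities from an Inception network run on the generated images (the function name and the `splits` default are illustrative, not this repo's actual code):

```python
import numpy as np

def inception_score(probs, splits=10):
    """Inception Score from per-image class probabilities.

    probs: (N, num_classes) softmax outputs of an Inception network
    evaluated on generated images (running that network is not shown).
    """
    scores = []
    for part in np.array_split(probs, splits):
        p_y = part.mean(axis=0, keepdims=True)        # marginal p(y)
        kl = part * (np.log(part + 1e-12) - np.log(p_y + 1e-12))
        scores.append(np.exp(kl.sum(axis=1).mean()))  # exp(E[KL(p(y|x) || p(y))])
    return float(np.mean(scores)), float(np.std(scores))
```

Splitting into 10 groups and reporting the mean and standard deviation over groups is the common convention; a score of 1.0 (the minimum) means the conditional and marginal class distributions are identical.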

#ACGAN

#BEGAN
The result is not good; we did not spend much time tuning the hyperparameters.

#CGAN

#DRAGAN
Stable, robust, and converges quickly.

#EBGAN
The network structure is the same as BEGAN's, but training collapses.

#GAN

#infoGAN

#LSGAN (Least Squares GAN)

#WGAN
Not as good as in the paper. The network structure is the same as GAN's, but it converges too slowly.
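The slow convergence is a known quirk of the original WGAN: the network is unchanged, only the loss differs. A hedged NumPy sketch contrasting the standard GAN discriminator loss with the WGAN critic loss (illustrative values, not this repo's code):

```python
import numpy as np

def gan_d_loss(logits_real, logits_fake):
    # Standard GAN: binary cross-entropy on the discriminator's logits,
    # i.e. -log sigmoid(D(real)) - log(1 - sigmoid(D(fake))).
    return float(np.mean(np.logaddexp(0.0, -logits_real))
                 + np.mean(np.logaddexp(0.0, logits_fake)))

def wgan_critic_loss(scores_real, scores_fake):
    # WGAN: no sigmoid; the critic maximizes E[f(real)] - E[f(fake)]
    # (minimized here as its negation). In WGAN the critic's weights are
    # also clipped to a small range after each update to keep f
    # approximately 1-Lipschitz.
    return float(np.mean(scores_fake) - np.mean(scores_real))
```

The weight clipping, in particular, is often blamed for slow or unstable training, which is what motivated the gradient-penalty variant below.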

#WGAN_GP
The discriminator is trained for 300 epochs in total, but the generator for only 60 (the same number as the other models). Converges slowly.
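The 300-vs-60 split matches WGAN-GP's usual n_critic = 5 schedule: five critic updates for every generator update, so 60 generator epochs correspond to 300 critic epochs. A minimal sketch of that schedule (the update callbacks are stand-ins, not this repo's functions):

```python
N_CRITIC = 5  # critic updates per generator update (common WGAN-GP default)

def train(generator_steps, critic_step, generator_step):
    for _ in range(generator_steps):
        for _ in range(N_CRITIC):
            critic_step()    # one discriminator/critic update
        generator_step()     # one generator update per N_CRITIC critic updates

# Counting updates reproduces the 300/60 ratio described above.
d_updates, g_updates = [], []
train(60, lambda: d_updates.append(1), lambda: g_updates.append(1))
```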

#VAE
Collapsed. We also tried adding and removing batch-normalization layers, but it did not help.
