Original implementation and hyperparameters for the 2014 paper "Generative Adversarial Networks" (GANs).
goodfeli/adversarial is the original code repository for the 2014 paper "Generative Adversarial Networks" (GANs). It provides the implementation and hyperparameters used to train the first GAN models, enabling researchers to reproduce the experiments from this foundational machine learning paper. The repository contains training scripts and configuration files for generating synthetic data using adversarial training.
Machine learning researchers and practitioners interested in studying the original GAN implementation, reproducing the 2014 paper results, or understanding the historical development of generative models.
This repository offers the exact code and configurations from the seminal GAN paper, providing an authentic reference implementation for academic research and historical analysis of generative adversarial networks.
Code and hyperparameters for the paper "Generative Adversarial Networks"
Provides the exact code and hyperparameters from the seminal 2014 GAN paper, enabling precise replication of the original experiments, a point the README emphasizes.
Includes parzen_ll.py, which evaluates trained models with Parzen window density estimation, the generative-performance metric described in the README.
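The Parzen technique estimates log-likelihood by averaging Gaussian kernels centered at generated samples and evaluating held-out test points under that density. A minimal NumPy sketch of the idea (function name and interface are illustrative, not the actual API of parzen_ll.py):

```python
import numpy as np

def parzen_log_likelihood(samples, test_points, sigma):
    """Mean log-likelihood of test_points under an isotropic Gaussian
    Parzen window fit to generated samples, with bandwidth sigma."""
    # pairwise scaled differences: shape (n_test, n_samples, dim)
    d = (test_points[:, None, :] - samples[None, :, :]) / sigma
    # log Gaussian kernel per (test, sample) pair, up to the normalizer
    exponent = -0.5 * np.sum(d ** 2, axis=2)
    # numerically stable log-mean-exp over the generated samples
    m = exponent.max(axis=1, keepdims=True)
    log_mean = m[:, 0] + np.log(np.mean(np.exp(exponent - m), axis=1))
    # log normalizing constant of the isotropic Gaussian kernel
    dim = samples.shape[1]
    log_norm = dim * np.log(sigma * np.sqrt(2.0 * np.pi))
    return np.mean(log_mean - log_norm)
```

In practice the bandwidth sigma is chosen by cross-validation on a validation set, which is how the paper reports its Parzen-based numbers.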
Offers separate YAML configuration files for datasets such as MNIST and CIFAR-10, simplifying training on the same data used in the original research.
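Pylearn2 experiments are driven by YAML files that instantiate Python objects via `!obj:` tags. A structural sketch of what such a file looks like (the dataset and algorithm classes are standard Pylearn2; the `adversarial.AdversaryPair` model name and all hyperparameter values are placeholders to be checked against the repo's actual YAML files):

```yaml
# Structural sketch of a Pylearn2 training config, not the repo's actual file.
!obj:pylearn2.train.Train {
    dataset: !obj:pylearn2.datasets.mnist.MNIST { which_set: 'train' },
    # the generator/discriminator pair defined by the repo is configured here;
    # required constructor arguments are omitted in this sketch
    model: !obj:adversarial.AdversaryPair { },
    algorithm: !obj:pylearn2.training_algorithms.sgd.SGD {
        learning_rate: .1,
        batch_size: 100,
    },
}
```

Such a file is typically passed to Pylearn2's train script, which parses the YAML, builds the objects, and runs the training loop.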
Serves as a canonical implementation for studying the evolution of generative adversarial networks, with direct ties to the foundational paper.
Built on Theano and Pylearn2 as of 2014, both of which are no longer maintained and are likely to break on modern systems, as the README warns.
The README explicitly states that no documentation or troubleshooting help is provided, making it difficult for users to resolve issues.
Was tuned on specific hardware (e.g., the GTX-580 GPU), so running on other GPUs may require hyperparameter retuning, adding setup complexity and potential discrepancies from the published results.
Lacks improvements in training stability, architecture, and features found in later GAN variants, limiting its utility for modern applications.