A high-performance neural network training interface for TensorFlow focused on speed, flexibility, and reproducible research.
Tensorpack is a neural network training interface built on TensorFlow that focuses on maximizing training speed and flexibility for deep learning research. It provides efficient data loading, scalable multi-GPU training, and high-quality reproducible implementations of research papers. The framework avoids being a restrictive model wrapper, allowing integration with various TensorFlow symbolic function libraries.
Deep learning researchers and practitioners who need fast, flexible training pipelines for computer vision, NLP, or reinforcement learning projects. It's particularly valuable for those reproducing papers or building production-ready models.
Developers choose Tensorpack for its significantly faster training speeds compared to alternatives like Keras, its research-focused flexibility with high-performance data loading, and its collection of reproducible paper implementations. It provides production-ready scalability without sacrificing experimental freedom.
Benchmarks show Tensorpack runs CNN training 1.2–5x faster than equivalent Keras code by using TensorFlow efficiently with no extra overhead, as highlighted in the README's performance comparisons.
The tensorpack.dataflow system uses pure Python with automatic parallelization, maximizing flexibility and speed for research data pipelines while sidestepping the limitations of symbolic approaches like tf.data.
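The core idea is that a DataFlow is just an iterable of datapoints, and pipelines are built by wrapping one dataflow in another. The sketch below illustrates that composition in plain Python; the class names mirror real tensorpack DataFlow components, but the implementations are simplified stand-ins, not the library's actual code:

```python
# Conceptual sketch of the DataFlow idea: small, composable Python
# iterators, each wrapping another. Simplified stand-ins for
# tensorpack's real DataFromList / MapData / BatchData.

class DataFromList:
    """Yield datapoints from an in-memory list."""
    def __init__(self, lst):
        self.lst = list(lst)

    def __iter__(self):
        yield from self.lst


class MapData:
    """Apply a function to every datapoint of an inner dataflow."""
    def __init__(self, df, func):
        self.df, self.func = df, func

    def __iter__(self):
        for dp in self.df:
            yield self.func(dp)


class BatchData:
    """Group consecutive datapoints into fixed-size batches."""
    def __init__(self, df, batch_size):
        self.df, self.batch_size = df, batch_size

    def __iter__(self):
        batch = []
        for dp in self.df:
            batch.append(dp)
            if len(batch) == self.batch_size:
                yield batch
                batch = []


# Stages compose by wrapping, the same way tensorpack chains DataFlows:
df = BatchData(MapData(DataFromList(range(8)), lambda x: x * 2), 4)
print(list(df))  # [[0, 2, 4, 6], [8, 10, 12, 14]]
```

Because each stage is ordinary Python, any preprocessing library (OpenCV, NumPy, etc.) drops in without symbolic-graph constraints; tensorpack then parallelizes such pipelines across processes for speed.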
Rather than toy examples, it provides high-quality, faithful reproductions of well-known papers in vision, NLP, and reinforcement learning, reflecting real research use.
Tensorpack is not a model wrapper, so it integrates with any TensorFlow symbolic function library, such as tf.layers, Keras, or tf-slim, without locking users in.
The README notes that Tensorpack uses TF1 compatibility mode for TF2, and some examples are not yet migrated, which can hinder adoption in modern TF2-native projects.
As noted in the installation instructions, Tensorpack's API is not yet stable, so users should pin exact versions in their dependencies to avoid breaking changes in production.
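In practice that means pinning in your dependency file; the version number below is illustrative, not a recommendation:

```
# requirements.txt — pin the exact tensorpack version you tested with
tensorpack==0.11
```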
Its focus on research flexibility, and its relative lack of pre-built high-level components, means developers must invest more time in setup and customization than with simpler APIs like Keras.