A Python library built on JAX for studying many-body quantum systems using neural networks and machine learning.
NetKet is an open-source Python library built on JAX that delivers cutting-edge methods for studying many-body quantum systems using artificial neural networks and machine learning techniques. It provides a comprehensive toolkit to represent quantum wavefunctions with neural networks and estimate quantum observables through variational Monte Carlo. The project bridges quantum physics and machine learning by offering efficient, scalable algorithms to make complex quantum systems more computationally tractable.
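To make the idea of "representing quantum wavefunctions with neural networks" concrete, here is a minimal pure-JAX sketch of a restricted Boltzmann machine (RBM) log-amplitude, one of the classic neural quantum states. The function and parameter names are illustrative, not NetKet's own API.

```python
import jax
import jax.numpy as jnp

# RBM log-amplitude for N spins s_j in {+1, -1}:
#   log psi(s) = sum_j a_j s_j + sum_i log cosh(b_i + sum_j W_ij s_j)

def init_rbm(key, n_spins, alpha=1):
    """Randomly initialize RBM parameters; alpha sets the hidden-unit density."""
    n_hidden = alpha * n_spins
    k1, k2, k3 = jax.random.split(key, 3)
    return {
        "a": 0.01 * jax.random.normal(k1, (n_spins,)),
        "b": 0.01 * jax.random.normal(k2, (n_hidden,)),
        "W": 0.01 * jax.random.normal(k3, (n_hidden, n_spins)),
    }

def log_psi(params, s):
    """Log of the variational wavefunction amplitude for one spin configuration."""
    theta = params["b"] + params["W"] @ s
    return params["a"] @ s + jnp.sum(jnp.log(jnp.cosh(theta)))

key = jax.random.PRNGKey(0)
params = init_rbm(key, n_spins=8)
s = jnp.array([1.0, -1.0, 1.0, 1.0, -1.0, -1.0, 1.0, -1.0])
print(log_psi(params, s))
```

Once such an ansatz is in place, expectation values of observables are estimated by Monte Carlo sampling of configurations weighted by |psi(s)|².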
Researchers and practitioners in quantum physics and computational science who need to simulate and analyze many-body quantum systems with advanced neural network techniques. This includes academics, scientists, and engineers working on quantum mechanics, condensed matter physics, or quantum computing who require high-performance tools for variational methods.
Developers choose NetKet for its foundation on the JAX machine learning framework, which provides automatic differentiation and GPU acceleration for high-performance computation on quantum problems. Its distinguishing feature is a specialized, scalable toolkit that combines neural quantum states with variational Monte Carlo methods, a pairing not commonly found in general-purpose quantum simulation libraries.
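The automatic-differentiation point can be illustrated in a few lines of plain JAX: in variational Monte Carlo, the energy gradient is built from derivatives of the log-amplitude with respect to the variational parameters, which JAX computes exactly and compiles (to GPU when one is available). The toy one-parameter ansatz below is purely illustrative.

```python
import jax
import jax.numpy as jnp

def log_psi(theta, s):
    # Toy one-parameter ansatz on a periodic chain:
    #   log psi(s) = theta * sum_j s_j s_{j+1}
    return theta * jnp.sum(s * jnp.roll(s, 1))

# Derivative with respect to the variational parameter, jit-compiled.
# In VMC this quantity, O(s) = d log psi(s) / d theta, enters the
# stochastic estimate of the energy gradient.
o_k = jax.jit(jax.grad(log_psi, argnums=0))

s = jnp.array([1.0, 1.0, 1.0, 1.0])
print(o_k(0.5, s))  # → 4.0, i.e. sum_j s_j s_{j+1} for this ansatz
```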
Machine learning algorithms for many-body quantum systems
Built on JAX for automatic differentiation and GPU acceleration, enabling high-speed computation on quantum systems; the installation guide covers CUDA support on Linux.
Provides cutting-edge methods such as neural quantum states and variational Monte Carlo, tailored specifically to many-body quantum systems and bridging quantum physics and machine learning.
Offers extensive tutorials, examples, and guides, including Colaboratory tutorials, to help users quickly get started with practical applications.
Supported by a Slack channel and GitHub discussions, facilitating collaboration and timely support among researchers and practitioners.
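The variational Monte Carlo methods mentioned above rest on Markov-chain sampling of spin configurations from |psi(s)|². The sketch below implements single-spin-flip Metropolis sampling for a toy positive-amplitude ansatz in plain JAX and estimates the magnetization; all names are illustrative, not NetKet's API.

```python
import jax
import jax.numpy as jnp

def log_psi(s):
    # Toy ansatz (log of a positive amplitude) favoring aligned neighbors.
    return 0.4 * jnp.sum(s * jnp.roll(s, 1))

def metropolis_chain(key, n_spins=10, n_steps=2000):
    """Sample spin configurations from |psi|^2 via single-spin-flip Metropolis."""
    key, sub = jax.random.split(key)
    s = jnp.where(jax.random.bernoulli(sub, 0.5, (n_spins,)), 1.0, -1.0)
    samples = []
    for _ in range(n_steps):
        key, k_site, k_acc = jax.random.split(key, 3)
        i = jax.random.randint(k_site, (), 0, n_spins)
        s_new = s.at[i].multiply(-1.0)  # propose flipping one spin
        # Accept with probability min(1, |psi(s')|^2 / |psi(s)|^2).
        log_ratio = 2.0 * (log_psi(s_new) - log_psi(s))
        if jnp.log(jax.random.uniform(k_acc)) < log_ratio:
            s = s_new
        samples.append(s)
    return jnp.stack(samples)

samples = metropolis_chain(jax.random.PRNGKey(1))
magnetization = jnp.mean(jnp.sum(samples, axis=1))
print(magnetization)
```

In practice libraries like NetKet vectorize many such chains and fuse the sampling loop with compiled kernels; this loop is kept in Python only for readability.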
GPU acceleration is only available on Linux, as explicitly noted in the installation instructions, which restricts performance on Windows and macOS and limits accessibility for some users.
Relies heavily on JAX, which has known issues with conda installations and can be complex to set up for those unfamiliar with the ecosystem, steepening the learning curve.
Primarily designed for many-body quantum systems with neural networks, making it less versatile for other quantum simulation approaches like quantum chemistry or general machine learning tasks.
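Given the conda caveat noted above, a pip-based setup is the usual route. The commands below are a typical sketch, not official instructions; the exact CUDA extra for JAX (e.g. `jax[cuda12]`) depends on your driver and CUDA version, so check the current NetKet and JAX installation guides.

```shell
# Install NetKet (pulls in a CPU build of JAX) in a fresh virtual environment
pip install --upgrade netket

# For NVIDIA GPU support on Linux, install a CUDA-enabled JAX build;
# the extra shown here assumes CUDA 12
pip install --upgrade "jax[cuda12]"
```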
🤗 Transformers: the model-definition framework for state-of-the-art machine learning across text, vision, audio, and multimodal models, for both inference and training.
Trax — Deep Learning with Clear Code and Speed
Flax is a neural network library for JAX that is designed for flexibility.