A model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks.
A multi-backend deep learning framework that enables effortless model development across JAX, TensorFlow, PyTorch, and OpenVINO.
A Python library for accelerator-oriented array computation and program transformation, designed for high-performance numerical computing and machine learning.
A Python library for composable transformations of numerical programs: automatic differentiation, vectorization, and JIT compilation to GPU/TPU.
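The composable-transformation idea behind JAX can be sketched in a few lines: `jax.grad`, `jax.vmap`, and `jax.jit` are ordinary functions that take and return functions, so they nest freely (the toy loss below is illustrative):

```python
import jax
import jax.numpy as jnp

# A scalar loss over a single input.
def loss(x):
    return jnp.sum(x ** 2)

# Differentiate, vectorize over a batch, then JIT-compile:
# the three transformations compose in any order.
grad_fn = jax.jit(jax.vmap(jax.grad(loss)))

batch = jnp.arange(3.0)   # [0., 1., 2.]
print(grad_fn(batch))     # d/dx x^2 = 2x -> [0., 2., 4.]
```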
Official JAX/Flax implementation of Vision Transformer (ViT) and MLP-Mixer for image recognition, with pre-trained models.
A research framework for fast prototyping of reinforcement learning algorithms, designed for easy experimentation and reproducibility.
A deep reinforcement learning library offering high-quality, single-file implementations of algorithms like PPO, DQN, and SAC for research and education.
A Python library for flexible and readable tensor operations across numpy, PyTorch, JAX, TensorFlow, and other frameworks.
An end-to-end deep learning library focused on clear code and speed, used for research and production by Google Brain.
A neural network library for JAX designed for flexibility, enabling researchers to experiment with new forms of training by modifying the training loop directly.
A JAX implementation of OpenAI's Whisper model offering up to 70x faster transcription on TPUs.
A library for probabilistic reasoning and statistical analysis integrated with TensorFlow and JAX.
A research framework for reinforcement learning providing modular building blocks and reference agent implementations.
A JAX library for rapid prototyping of large-scale attention-based vision models across images, video, audio, and multimodal data.
A comprehensive Python-first reinforcement learning framework with modular abstractions for decision intelligence applications.
A JAX-based neural network library that provides a simple, object-oriented programming model for building and training models.
A fast, differentiable physics engine built with JAX for massively parallel rigid body simulation on accelerator hardware.
A JAX library for neural networks and scientific computing with PyTorch-like syntax and full ecosystem compatibility.
A lightweight probabilistic programming library using NumPy and JAX for autograd and JIT compilation to GPU/TPU/CPU.
A JAX-native library implementing Monte Carlo tree search algorithms like AlphaZero and MuZero for reinforcement learning research.
A JAX/Flax-based framework for easy and scalable pre-training, fine-tuning, evaluation, and serving of large language models.
A high-level neural network API for specifying and analyzing infinite-width neural networks as Gaussian Processes in Python.
A high-performance, scalable LLM library and reference implementation written in pure Python/JAX for training on TPUs and GPUs.
A gradient processing and optimization library for JAX, designed for research with composable building blocks.
A curated list of awesome libraries, projects, tutorials, and resources for the JAX machine learning ecosystem.
A Python library for constructing differentiable convex optimization layers in PyTorch, JAX, and MLX using CVXPY.
A JAX-based library providing numerical differential equation solvers for ODEs, SDEs, and CDEs with autodifferentiation and GPU support.
A JAX research toolkit for building, editing, and visualizing neural networks as legible, functional pytree data structures.
Official repository for Big Transfer (BiT) models, providing pre-trained visual representations for efficient transfer learning across computer vision tasks.
Open-Awesome is built by the community, for the community. Submit a project, suggest an awesome list, or help improve the catalog on GitHub.