Official inference framework for 1-bit LLMs, enabling fast, lossless CPU/GPU inference with significant gains in speed and energy efficiency.
Industrial-strength Natural Language Processing library for Python, featuring pretrained pipelines for 70+ languages and production-ready training.
A Python framework for computing and training state-of-the-art text embeddings, rerankers, and sparse encoders.
A comprehensive library for post-training foundation models using reinforcement learning and fine-tuning techniques.
A PyTorch library providing 12+ semantic segmentation model architectures with 800+ pretrained convolutional and transformer-based encoders.
A state-of-the-art Natural Language Processing library built on Apache Spark, offering 100,000+ pretrained models and pipelines in 200+ languages.
A Python library offering scalable and user-friendly implementations of state-of-the-art neural forecasting models.
A Rust-native port of Hugging Face Transformers providing ready-to-use NLP pipelines and transformer models such as BERT, GPT-2, and T5.
A modular toolkit for machine learning, natural language processing, and text generation, available in both TensorFlow and PyTorch versions.
A high-performance, scalable LLM library and reference implementation written in pure Python/JAX for training on TPUs and GPUs.
A JAX research toolkit for building, editing, and visualizing neural networks as legible, functional pytree data structures.
Open-Awesome is built by the community, for the community. Submit a project, suggest an awesome list, or help improve the catalog on GitHub.