A C/C++ library for efficient, cross-platform LLM inference with extensive hardware support and quantization.
A lightweight, single-binary Rust inference server providing fully OpenAI-compatible API endpoints for local GGUF models.
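Because the server above exposes OpenAI-compatible endpoints, any standard chat-completions request works against it. The sketch below builds such a payload; the base URL and model name are placeholders for whatever the local server is configured to serve, not values taken from the project itself.

```python
import json

# Hypothetical local endpoint; OpenAI-compatible servers conventionally
# expose /v1/chat/completions, but the host and port depend on your setup.
BASE_URL = "http://localhost:8080/v1/chat/completions"

# Standard OpenAI chat-completions request body. The model identifier is
# a placeholder for a locally loaded GGUF model.
payload = {
    "model": "local-gguf-model",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}

# Serialize the request body; POST this to BASE_URL with
# Content-Type: application/json, e.g. via urllib.request or the
# official openai client pointed at the local base URL.
body = json.dumps(payload)
print(body)
```

Pointing an existing OpenAI client at the local base URL is usually all that is needed to switch from a hosted model to a local one.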
A library for running LLMs locally and efficiently on any device with support for Python, Flutter, and Godot.
Open-Awesome is built by the community, for the community. Submit a project, suggest an awesome list, or help improve the catalog on GitHub.