A platform to run, manage, and serve open-source large language models (LLMs) locally or on your own infrastructure.
A platform to run, manage, and serve open-source large language models locally with a simple CLI and REST API.
A self-hosted, extensible AI platform with a user-friendly web interface supporting Ollama, OpenAI API, and offline RAG.
A cross-platform desktop client for ChatGPT, Claude, and other LLMs with local data storage and a powerful prompt library.
A privacy-focused AI answer engine that runs on your own hardware, combining web search with local and cloud LLMs.
An open-source AI chat interface that connects to various AI models, including OpenAI, Azure, and local models via Ollama.
An open-source, AI-integrated cross-platform terminal with durable SSH sessions, built-in file editing, and flexible workspace organization.
A command-line AI assistant that generates shell commands, code snippets, and documentation using large language models.
A command-line interface tool that integrates multiple LLM providers, offering shell assistance, interactive chat, RAG, AI tools, and local server capabilities.
The Go implementation of LangChain, enabling developers to build LLM-powered applications through composable components.
A Neovim plugin that integrates LLMs and AI agents for coding assistance, chat, and inline transformations.
A privacy-focused macOS app that automatically watches your screen and uses AI to generate a timeline of your daily activities.
Automatically generates table-driven Go test boilerplate from source code, with optional AI-powered test case generation.
An AI-powered research assistant that performs deep, agentic research using multiple LLMs and search engines with full local deployment and encryption.
A CLI tool and library that uses LLMs to generate Infrastructure-as-Code templates, configuration files, and utilities from natural language prompts.
A self-contained, pure-Go web server with built-in Lua scripting, multiple template engines, database backends, and support for HTTP/2, QUIC, and AI/LLM integration.
An AI-powered tool that automatically generates detailed and customizable README files from any code repository.
A CLI and companion app that spins up a complete, pre-wired local LLM stack with hundreds of services using a single command.
A Ruby library for building LLM-powered applications with a unified interface for multiple providers, RAG systems, and AI assistants.
A Neovim plugin for generating and editing text using local LLMs like Llama and Mistral via Ollama.
A desktop AI assistant and universal MCP client that works with any LLM provider, offering chat, image/video generation, and system-wide productivity tools.
Open-Awesome is built by the community, for the community. Submit a project, suggest an awesome list, or help improve the catalog on GitHub.