A CLI and companion app that spins up a complete, pre-wired local LLM stack from a catalog of hundreds of services with a single command.
Harbor is a CLI and companion application that lets you spin up a complete local LLM stack—including backends like Ollama and llama.cpp, frontends like Open WebUI, and supporting services for search, voice chat, and image generation—with a single `harbor up` command. It handles Docker Compose orchestration, configuration, and cross-service connectivity, enabling users to focus on using models rather than manual setup.
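A typical session might look like the following sketch (commands taken from Harbor's README; `searxng` is one example service name from its catalog, and output is omitted):

```shell
# Start the default stack (e.g. an Ollama backend plus the Open WebUI frontend)
harbor up

# Start the stack with an extra service pre-wired in, such as SearXNG for web search
harbor up searxng

# Open the default frontend in a browser
harbor open

# Tear the stack down when finished
harbor down
```

Under the hood each of these maps onto Docker Compose operations against Harbor's pre-wired service definitions.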
Developers and researchers experimenting with local LLMs who want a pre-configured, integrated stack without manually managing Docker containers and service connections. It's also suitable for those who need to quickly prototype or test different LLM frontends, backends, and satellite services together.
Developers choose Harbor for its one-command setup of a fully integrated local LLM ecosystem with pre-wired connectivity between services, eliminating complex manual configuration. Its extensive catalog of hundreds of services and ability to export setups to standalone Docker Compose files provide both convenience and flexibility.
One command brings up a complete pre-wired LLM stack, with hundreds of services to explore.
The README highlights that you can launch a fully configured LLM stack with just `harbor up`, eliminating manual configuration and enabling instant experimentation.
Services like SearXNG are automatically connected to Open WebUI for Web RAG, and Speaches provides TTS/STT out of the box, saving significant setup time.
It includes hundreds of services, from backends like vLLM and llama.cpp to frontends and tools like ComfyUI and Dify, offering a broad ecosystem in one place.
Users can save profiles with `harbor profile save` and export to standalone Docker Compose via `harbor eject`, balancing convenience with future independence.
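The profile and eject workflow can be sketched as follows (the profile name `my-setup` is illustrative; the `eject` invocation with explicit service names and an output redirect follows the pattern shown in Harbor's README):

```shell
# Snapshot the current configuration under a named profile for later reuse
harbor profile save my-setup

# Export the selected services to a standalone Docker Compose file
# that runs without Harbor installed
harbor eject searxng llamacpp > docker-compose.harbor.yml
```

The ejected Compose file is the escape hatch: once generated, the stack can be managed with plain `docker compose up` and no longer depends on Harbor.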
Harbor relies entirely on Docker Compose, so it cannot run in environments without Docker (or where Docker has compatibility issues), and it adds overhead for lightweight setups where a single binary would suffice.
The README explicitly warns that using `harbor tunnel` to expose services is 'not safe,' indicating potential vulnerabilities if not carefully managed.
It's designed as a 'helper for local LLM development environments,' lacking built-in features for production deployment, monitoring, or scaling.