A local CLI copilot that suggests and explains shell commands using open-source models via Ollama, with no API keys or internet required.
tlm is a local command-line interface copilot that uses open-source AI models via Ollama to assist developers with shell commands. It suggests commands based on natural language prompts, explains existing commands, and answers questions with optional file context—all running entirely on the user's machine without requiring internet or API keys.
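A minimal sketch of the three interactions described above. The subcommand names (suggest, explain, ask) come from the description; the exact flags and quoting conventions are assumptions and may differ from the shipped CLI.

```shell
# Ask for a command from a natural-language prompt (subcommand per description above)
tlm suggest "find all files larger than 1GB under /var/log"

# Explain what an existing command does
tlm explain "tar -xzvf archive.tar.gz"

# Ask a question with local file context (beta; flag name is hypothetical)
tlm ask --context . "where is the HTTP retry logic configured?"
```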
Developers and system administrators who work extensively in the terminal and want AI-powered command assistance without relying on cloud services or compromising privacy.
tlm offers a private, offline-first alternative to cloud-based CLI copilots by leveraging locally run open-source models, eliminating subscription costs and data privacy concerns while providing customizable model choices and shell integration.
Local CLI Copilot, powered by Ollama. 💻🦙
Runs entirely offline with no API keys, subscriptions, or internet connection, so user data never leaves the local machine.
Supports a range of open-source models such as Llama 3.3 and Qwen via Ollama, letting users experiment and choose a model that fits their hardware and performance needs.
Includes a beta ask command that uses Retrieval-Augmented Generation (RAG) over local file context, enabling project-specific questions without cloud dependencies.
Works on macOS, Linux, and Windows with automatic shell detection for Bash, Zsh, and PowerShell, making it accessible across diverse development setups.
Requires installing Ollama and downloading models separately, adding setup steps beyond tlm itself that can be cumbersome for users wanting a plug-and-play solution.
Running large language models locally consumes significant CPU/GPU and memory, potentially slowing down other applications on the machine.
The RAG-based ask command is marked as beta, indicating it may be less stable or feature-complete compared to core functions like suggest and explain.
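The separate Ollama install and model download noted among the drawbacks might look like the following. The Ollama install script and pull command are Ollama's documented usage; the final line installing tlm via Go is an assumption based on common Go-tool conventions, not necessarily the project's documented install method.

```shell
# Install Ollama on Linux/macOS using its official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download an open-source model for tlm to use (Llama 3.3, per the features above)
ollama pull llama3.3

# Install tlm itself (hypothetical: assumes a Go module at this path)
go install github.com/yusufcanb/tlm@latest
```

Each step pulls gigabytes of model weights or binaries, which is the setup friction the drawback describes.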
tlm is an open-source alternative to the following products:
Claude CLI tools are command-line utilities for interacting with Anthropic's Claude AI models, enabling text generation and processing via terminal commands.
ChatGPT CLI tools are command-line interface applications that allow users to interact with OpenAI's ChatGPT models directly from their terminal or shell environment.
GitHub Copilot CLI is a command-line interface version of GitHub Copilot that provides AI-assisted code suggestions and completions directly in the terminal.