A lightweight MCP server providing a unified interface to multiple LLM providers including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama.
Just Prompt is a lightweight Model Context Protocol (MCP) server that provides a unified interface to multiple large language model providers. It allows developers to interact with OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama models through a single API, eliminating the need to learn and manage each provider's SDK and interface.
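To make the single-API idea concrete, here is a minimal sketch of a client calling the server through the official MCP Python SDK. The launch command, the `prompt` tool name, and the `text` / `models_prefixed_by_provider` arguments follow the project's documented usage but should be treated as assumptions that can vary between releases.

```python
# Minimal sketch: one MCP tool call fans a prompt out to several providers.
# Assumes the `mcp` Python SDK is installed and that just-prompt can be started
# with `uv run just-prompt`; the tool and argument names are assumptions based
# on the project's docs.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="uv", args=["run", "just-prompt"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # One call covers several providers without touching their SDKs.
            result = await session.call_tool(
                "prompt",
                {
                    "text": "Summarize the tradeoffs of vector databases.",
                    "models_prefixed_by_provider": [
                        "openai:gpt-4o-mini",
                        "anthropic:claude-3-5-haiku",
                        "ollama:llama3.1",
                    ],
                },
            )
            for block in result.content:
                print(getattr(block, "text", block))


asyncio.run(main())
```

Under that provider-prefixed naming convention, swapping a model in or out is a one-line change to the argument list rather than a new SDK integration.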
Developers and AI engineers who work with multiple LLM providers and want a consistent interface for testing, comparing, and integrating different AI models into their applications.
Just Prompt simplifies AI development by abstracting provider complexities, enabling parallel model execution, and offering unique tools like the CEO & Board feature for collaborative AI decision-making—all through a lightweight, self-hostable MCP server.
just-prompt is an MCP server that provides a unified interface to top LLM providers (OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama).
Abstracts away the differences between OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama, exposing one consistent interface regardless of which provider serves the request.
Runs multiple models simultaneously from a single prompt, allowing efficient comparison and testing across providers without issuing one sequential call per model.
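The fan-out itself can be pictured as a plain concurrent dispatch. The sketch below is a conceptual illustration using asyncio, not the project's actual implementation, and `call_provider` is a placeholder for the real provider clients.

```python
# Conceptual sketch of the fan-out pattern (not just-prompt's actual code):
# one prompt is dispatched to every requested model concurrently, and the
# responses come back as a list in the same order as the model names.
import asyncio


async def call_provider(model: str, text: str) -> str:
    """Placeholder for a provider-specific API call (OpenAI, Anthropic, etc.)."""
    await asyncio.sleep(0.1)  # stands in for network latency
    return f"[{model}] response to: {text!r}"


async def prompt_all(models: list[str], text: str) -> list[str]:
    # asyncio.gather runs every provider call concurrently instead of one by one.
    return await asyncio.gather(*(call_provider(m, text) for m in models))


responses = asyncio.run(
    prompt_all(
        ["openai:gpt-4o-mini", "groq:llama-3.1-70b", "deepseek:deepseek-chat"],
        "Compare quicksort and mergesort.",
    )
)
for line in responses:
    print(line)
```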
Includes unique features like the CEO & Board tool for collaborative decision-making and supports provider-specific controls such as OpenAI reasoning effort, Claude thinking tokens, and Gemini thinking budgets.
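The provider-specific controls and the CEO & Board flow are driven through the request itself. The dictionary below is a hedged illustration of what such a request might look like: the suffix syntax for reasoning effort and thinking budgets, the `ceo_and_board` tool name, and every argument name are assumptions based on the project's docs and may differ in the current release.

```python
# Hedged illustration only: how provider-specific controls might be expressed as
# model-name suffixes in a CEO & Board request. The tool name, argument names,
# and suffix syntax are assumptions, not a verified API.
board_request = {
    "tool": "ceo_and_board",  # assumed snake_case name for the CEO & Board tool
    "arguments": {
        "from_file": "specs/architecture-decision.md",  # hypothetical prompt file
        "models_prefixed_by_provider": [
            "openai:o4-mini:high",             # assumed suffix: OpenAI reasoning effort
            "anthropic:claude-3-7-sonnet:4k",  # assumed suffix: Claude thinking-token budget
            "gemini:gemini-2.5-flash:4k",      # assumed suffix: Gemini thinking budget
        ],
        "ceo_model": "openai:o3",  # assumed: the model that writes the final decision
    },
}
print(board_request)
```

In this pattern the listed board models each draft an answer to the prompt, and the designated CEO model reviews those drafts to produce a final decision, which is the collaborative decision-making the feature description refers to.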
Offers tools like prompt_from_file and prompt_from_file_to_file to handle prompts from text files and save responses as markdown, streamlining batch processing and documentation.
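A file-driven batch run might look like the sketch below, using the prompt_from_file_to_file tool named above. The argument names (`file`, `models_prefixed_by_provider`, `output_dir`) and the launch command are assumptions and may not match the current release exactly.

```python
# Sketch of a file-in / markdown-out batch run via prompt_from_file_to_file.
# The tool name comes from the project docs; the argument names and the
# `uv run just-prompt` launch command are assumptions.
import asyncio
from pathlib import Path

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    Path("prompts").mkdir(exist_ok=True)
    Path("prompts/release-notes.txt").write_text("Draft release notes for v0.4.0.\n")

    server = StdioServerParameters(command="uv", args=["run", "just-prompt"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Each model's answer is written to its own markdown file in output_dir.
            await session.call_tool(
                "prompt_from_file_to_file",
                {
                    "file": "prompts/release-notes.txt",
                    "models_prefixed_by_provider": [
                        "openai:gpt-4o-mini",
                        "deepseek:deepseek-chat",
                    ],
                    "output_dir": "responses",
                },
            )


asyncio.run(main())
```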
Requires managing environment variables for multiple API keys and wiring the server into an MCP client configuration, which can be cumbersome and error-prone, as the installation instructions show.
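A small pre-flight check can catch a missing key before the server is registered with an MCP client. The variable names below follow common provider conventions and the project's setup notes, but treat them as assumptions; only the providers you intend to use need a key, and Ollama needs a reachable host rather than a key.

```python
# Pre-flight sketch: verify the API-key environment variables before launching.
# The variable names are assumptions based on common provider conventions and
# the project's setup notes.
import os

EXPECTED_KEYS = {
    "OPENAI_API_KEY": "OpenAI",
    "ANTHROPIC_API_KEY": "Anthropic",
    "GEMINI_API_KEY": "Google Gemini",
    "GROQ_API_KEY": "Groq",
    "DEEPSEEK_API_KEY": "DeepSeek",
}

missing = [name for name in EXPECTED_KEYS if not os.environ.get(name)]
for name in missing:
    print(f"warning: {name} is not set; {EXPECTED_KEYS[name]} models will be unavailable")

# Ollama is addressed by host URL instead of an API key.
print("OLLAMA_HOST:", os.environ.get("OLLAMA_HOST", "http://localhost:11434 (assumed default)"))
```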
Built on the Model Context Protocol (MCP), which is still a young and evolving specification, so compatibility issues or breaking changes could affect stability.
May not immediately support every advanced feature of the individual provider SDKs; the project itself acknowledges this by falling back to simpler behavior for OpenAI reasoning effort when an older SDK version is installed.
The abstraction layer and MCP server setup can introduce latency compared to direct API calls, making it less ideal for time-sensitive applications.