An Obsidian plugin that integrates ChatGPT, OpenRouter.ai, and local LLMs for AI-powered conversations directly within your notes.
ChatGPT MD is an Obsidian plugin that integrates AI assistants directly into your note-taking environment. It allows users to chat with various AI models—including cloud services like OpenAI and OpenRouter.ai, or local LLMs via Ollama—without leaving their vault, enabling AI-powered brainstorming, research, and content generation within their existing workflow.
Obsidian users who want to incorporate AI assistance into their note-taking, knowledge management, and writing processes, particularly those valuing privacy, offline capabilities, and deep integration with their vault.
Developers choose ChatGPT MD for its seamless Obsidian integration, privacy-first design with local LLM support, and flexible multi-provider architecture that avoids vendor lock-in while keeping conversations contextual within their notes.
A (nearly) seamless integration of ChatGPT into Obsidian.
Supports OpenAI, OpenRouter.ai, and local LLMs via Ollama/LM Studio, allowing easy switching between cloud and offline models for diverse AI needs.
Implements human-in-the-loop approval for vault search and web queries, ensuring data stays local unless explicitly shared, as highlighted in the v3.0.0 release notes.
Enables AI conversations directly within notes, with per-note configuration via frontmatter and commands accessible from Obsidian's command palette, so chats fit into the existing note-taking workflow.
Allows complete offline AI usage with Ollama or LM Studio, avoiding API costs and enhancing privacy, as detailed in the setup guides for local models.
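Per-note configuration lives in the note's YAML frontmatter. The sketch below illustrates the idea; the key names shown are assumptions based on the features described above, so check the plugin's README for the exact schema it supports.

```yaml
---
# Illustrative ChatGPT MD frontmatter for a single chat note.
# Key names here are assumptions, not the plugin's confirmed schema.
model: gpt-4o              # swap for a local model to go offline
temperature: 0.7
stream: true
system_commands:
  - You are a concise research assistant.
---
```

Because the settings travel with the note, two notes in the same vault can target different providers (e.g. a cloud model for research, a local one for private journaling) without touching the global plugin settings.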
Only functions within Obsidian, so it offers nothing to users of other note-taking platforms or anyone wanting a standalone application outside the vault ecosystem.
Tool execution requires user approval for each step, which can disrupt automated processes and slow down productivity, despite privacy benefits.
Configuring Ollama or LM Studio involves separate installations, model downloads, and URL settings, adding overhead compared to cloud-only plugins.
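To make the local-model overhead concrete, a minimal Ollama setup looks roughly like this; the model name is illustrative, and the URL is Ollama's documented default:

```shell
# Install Ollama (https://ollama.com), then pull a model locally.
ollama pull llama3        # downloads the model weights to your machine

# Start the local server (runs in the foreground; many installs
# launch it as a background service automatically).
ollama serve

# Ollama listens on http://localhost:11434 by default; point the
# plugin's Ollama URL setting at that address. Sanity check:
curl http://localhost:11434/api/tags   # lists installed models
```

LM Studio follows the same pattern with its own local server and port, which is the extra configuration step cloud-only plugins avoid.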