A local-first knowledge base that enables AI assistants to persistently read and write structured notes using the Model Context Protocol.
Basic Memory is an open-source tool that lets Large Language Models (LLMs) such as Claude build and query a persistent, structured knowledge base through natural conversation. All data is stored as Markdown files on your local machine, creating a bi-directional flow in which both humans and AI contribute to a growing knowledge graph. This local-first, structured memory persists across conversations, solving the problem of ephemeral LLM interactions.
Developers and knowledge workers who use LLM assistants like Claude Desktop and want to build a persistent, local knowledge base that integrates seamlessly with their conversational workflow. It's also suitable for users of note-taking tools like Obsidian who want AI collaboration.
Developers choose Basic Memory for its local-first storage in plain Markdown files, which gives users full control and privacy, and for its bi-directional integration via the Model Context Protocol (MCP), which lets LLMs both read from and write to the knowledge base. Its unique selling point is turning Markdown notes into a traversable semantic network that both humans and AI can edit.
AI conversations that actually remember. Never re-explain your project to your AI again. Join our Discord: https://discord.gg/tyvKNccgqN
Stores all data as Markdown files on your local machine, ensuring full ownership and privacy with no mandatory cloud dependency.
Uses the Model Context Protocol (MCP) to enable bi-directional flow with LLMs like Claude, allowing natural conversation to read and write to the knowledge base without special prompting.
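Connecting an MCP client typically means registering Basic Memory as an MCP server in the client's configuration. The snippet below is an illustrative Claude Desktop entry; the exact command and arguments depend on how you installed Basic Memory, so treat the `uvx` invocation as an assumption rather than the only supported setup.

```json
{
  "mcpServers": {
    "basic-memory": {
      "command": "uvx",
      "args": ["basic-memory", "mcp"]
    }
  }
}
```

After restarting the client, the assistant can call Basic Memory's read and write tools during normal conversation, with no special prompting.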
Automatically extracts entities, observations, and relations from Markdown to create a traversable semantic network, demonstrated in the coffee brewing example with linked topics.
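To make the entity/observation/relation model concrete, here is a sketch of what such a note might look like. The frontmatter fields, observation categories, and relation names are illustrative assumptions based on the model described above, not a definitive schema.

```markdown
---
title: Coffee Brewing
type: note
---

# Coffee Brewing

## Observations
- [technique] Pour-over highlights brighter flavor notes
- [ratio] A 1:16 coffee-to-water ratio is a common starting point

## Relations
- relates_to [[Coffee Bean Origins]]
- requires [[Grind Size]]
```

Each categorized bullet becomes an observation on the entity, and each `[[WikiLink]]` bullet becomes a typed edge in the knowledge graph, so notes stay readable in tools like Obsidian while remaining machine-traversable.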
Combines full-text and semantic vector search using FastEmbed embeddings, providing accurate retrieval by both keywords and meaning, as noted in the v0.19.0 update.
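The idea behind hybrid retrieval can be sketched as blending a keyword-match score with a vector-similarity score. This is a minimal illustration, not Basic Memory's actual ranking code: in practice FastEmbed would supply the document and query vectors, and the `alpha` blending weight here is a hypothetical parameter.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(keyword_score, query_vec, doc_vec, alpha=0.5):
    # Linear blend of full-text relevance and semantic similarity.
    # The blend and alpha are illustrative assumptions, not the
    # library's actual formula.
    return alpha * keyword_score + (1 - alpha) * cosine(query_vec, doc_vec)

# Identical vectors plus a perfect keyword match score highest:
print(hybrid_score(1.0, [1.0, 0.0], [1.0, 0.0]))  # → 1.0
```

A document can therefore rank well either because it contains the query's exact terms or because its embedding sits close to the query's meaning.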
Limited to LLMs and tools that support the Model Context Protocol, excluding popular AI assistants like OpenAI's ChatGPT that lack native MCP integration.
Requires non-trivial setup steps, such as editing Claude Desktop config files or VS Code settings, which can be a barrier for users unfamiliar with MCP or CLI tools.
Optional cloud synchronization and advanced features like per-project routing require a subscription, creating a vendor dependency despite the open-source local core.