m4k00/mcp-knowledge-base
Give your AI assistant a long-term memory. An MCP server that turns a local directory of markdown files into a semantic knowledge base for Claude Desktop and other MCP clients.
Platform-specific configuration: add the following to `.claude/settings.json` under the `mcpServers` key:

```json
{
  "mcpServers": {
    "mcp-knowledge-base": {
      "command": "npx",
      "args": ["-y", "mcp-knowledge-base"]
    }
  }
}
```
> Drop markdown files into a folder — Claude instantly knows your notes, docs, and code snippets.
MCP Knowledge Base Server is a personal knowledge base that connects directly to AI assistants like Claude Desktop and Cursor. It turns a local folder of markdown files into a queryable, semantically searchable memory layer — without any cloud infrastructure or external databases.
It works over the Model Context Protocol (MCP), a standard that lets AI assistants call tools and retrieve data from local servers. When you ask Claude *"what do my notes say about deployment?"*, it uses this server to run a semantic similarity search over your embedded documents and return the most relevant chunks — all locally, in milliseconds.
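To make the MCP flow above concrete, here is a minimal sketch of the JSON-RPC `tools/call` request an MCP client might send for such a query. The tool name `search` and its argument shape are illustrative assumptions, not this server's actual schema.

```typescript
// Hypothetical MCP tool-call request. "search", "query", and "limit"
// are assumed names for illustration; consult the server's tool list
// for the real schema.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search",
    arguments: { query: "what do my notes say about deployment?", limit: 5 },
  },
};
```

The client serializes this over stdio to the local server process, which replies with the matching document chunks.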
Under the hood, it uses OpenAI's text-embedding-3-small model to generate vector embeddings for your markdown content, stores them in a local SQLite database, and performs cosine similarity search to find the most relevant material. The server auto-indexes on startup (only re-embedding changed files), so your knowledge base stays fresh with zero manual work.
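The similarity step described above can be sketched as plain cosine similarity over the stored vectors. This is a simplified stand-in, not the server's internals: the chunk data here is invented, and in practice the vectors come from the SQLite store and have the embedding model's full dimensionality.

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored chunks by similarity to an (illustrative) query embedding.
const queryEmbedding = [0.1, 0.9, 0.2];
const chunks = [
  { id: "deploy.md#0", embedding: [0.1, 0.8, 0.3] },
  { id: "recipes.md#2", embedding: [0.9, 0.1, 0.0] },
];
const ranked = chunks
  .map((c) => ({ id: c.id, score: cosineSimilarity(queryEmbedding, c.embedding) }))
  .sort((x, y) => y.score - x.score);
```

The highest-scoring chunks are what the server returns to the assistant as search results.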
The package also provides CLI commands: `mcp-kb index`, `mcp-kb stats`, and `mcp-kb serve`.
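The "only re-embedding changed files" behavior mentioned above amounts to change detection at index time. A common approach is a content-hash comparison, sketched below; the helper names and the in-memory map (standing in for the SQLite store) are illustrative assumptions.

```typescript
import { createHash } from "node:crypto";

// In-memory stand-in for the persisted index: path -> content hash.
const index = new Map<string, string>();

function hashContent(content: string): string {
  return createHash("sha256").update(content).digest("hex");
}

// A file needs re-embedding only when its content hash differs from
// the stored one; unchanged files are skipped entirely.
function needsReindex(path: string, content: string): boolean {
  const hash = hashContent(content);
  if (index.get(path) === hash) return false; // unchanged, skip
  index.set(path, hash); // record new hash; caller re-embeds the file
  return true;
}
```

Run at startup over every markdown file, this keeps indexing cheap: only new or edited files incur an embedding call.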