# tymorton/surreal-mem-mcp
A standalone Edge-RAG Memory Daemon for autonomous AI agents. High-performance Rust core (SurrealDB/RocksDB + Axum) with Bayesian Math Retrieval, multi-language AST indexing, and enterprise Semantic Redaction.
Platform-specific configuration:
```json
{
  "mcpServers": {
    "surreal-mem-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "surreal-mem-mcp"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
A standalone Edge-RAG Memory Daemon using HTTP/SSE MCP transport. The architecture consists of a high-performance Rust core (SurrealDB/RocksDB + Axum server) and a zero-config Python CLI Bootstrapper. It bypasses heavy LLM-as-a-Judge rerankers in favor of a sub-millisecond Bayesian Math Outer Query that combines Cosine Similarity and BM25 posterior scores.
This Model Context Protocol (MCP) server allows your AI agents (Claude Desktop, Cursor, Gemini CLI, OpenCode, Code Puppy, etc.) to use a shared global memory system, complete with standardized behavioral rules (SOUL.md and MEMORY.md), independent of your current project directory.
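The shared rule files (SOUL.md, MEMORY.md) live under `~/.surreal-mem-mcp/rules/`. As a minimal sketch, a client-side loader for that directory might look like the following; the `load_rules` helper and its file-layout assumptions are illustrative, not part of the actual bootstrapper:

```python
from pathlib import Path

# Default location of the shared rule files (per the README).
RULES_DIR = Path.home() / ".surreal-mem-mcp" / "rules"

def load_rules(rules_dir: Path = RULES_DIR) -> dict[str, str]:
    """Read every Markdown rule file (e.g. SOUL.md, MEMORY.md) into a
    dict keyed by filename. Returns {} if the directory is absent, so
    agents degrade gracefully on machines without the daemon installed."""
    if not rules_dir.is_dir():
        return {}
    return {p.name: p.read_text(encoding="utf-8")
            for p in sorted(rules_dir.glob("*.md"))}
```

Because the directory lives in the home folder rather than a project folder, every agent on the machine sees the same rules regardless of its working directory.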
- **Local model support**: works with local inference providers (Ollama, LM Studio) natively.
- **Bayesian Math Retrieval**: ranks memories with `vector::similarity::cosine() * 0.7` plus `BM25 * 0.3`, weighted by an epistemic prior (time decay, graph density, access counts). Let math do the parsing, not latency-heavy LLMs.
- **Universal rules**: rule files in `~/.surreal-mem-mcp/rules/` are accessible universally via `resources/list`, so memory rules persist across completely disparate agent ecosystems.

> 💡 Curious about the technical tradeoffs? Read [ARCHITECTURE.md](./ARCHITECTURE.md) to see how our Epistemic Math Queries drastically cut LLM token consumption, and why our embedded Graph Database outperforms traditional SQLite for dense code reasoning.