ped-ro/memex
A personal knowledge engine — local semantic search, MCP server, and sync stack for your second brain. No cloud, no subscriptions.
Platform-specific configuration:

Add the config below to `.claude/settings.json` under the `mcpServers` key:

```json
{
  "mcpServers": {
    "memex": {
      "command": "npx",
      "args": ["-y", "memex"]
    }
  }
}
```
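Once registered, clients talk to the server over MCP's JSON-RPC 2.0 transport. A rough sketch of the two core messages (`tools/list` and `tools/call` are standard MCP methods; the `query` argument name for `search_vault` is an assumption, not memex's documented schema):

```python
import json

# "tools/list" enumerates the server's tools; "tools/call" invokes one.
# Both are standard MCP methods. The "query" argument name is assumed
# for illustration only.
list_req = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
call_req = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_vault",
        "arguments": {"query": "zettelkasten workflow"},
    },
}
print(json.dumps(list_req))
print(json.dumps(call_req))
```

In practice Claude Desktop and Claude Code handle this framing for you; it is shown here only to make the transport concrete.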
Your vault is exposed to Claude, ChatGPT, and any HTTP client. No cloud, no subscriptions: everything runs on your machine.
Available MCP tools: `search_vault`, `get_note`, `get_backlinks`, `related_notes`, `search_by_tag`, `recent_notes`, `vault_stats`

Architecture:

```
Obsidian Vault (local)
        ↓ (mount)
sync daemon → embeddings service (mxbai-embed-large-v1, local)
        ↓
pgvector (PostgreSQL 17 + pgvector extension)
        ↓
MCP server (Node.js, port 3456)
  ├── Claude Desktop / Claude Code (MCP transport)
  ├── ChatGPT connector (REST via Cloudflare Tunnel)
  └── Any HTTP client
```

Quick start:

```sh
git clone https://github.com/ped-ro/memex
cd memex

# 1. Configure
cp .env.example .env
# Edit .env — set VAULT_PATH, PGPASSWORD, MCP_API_KEY

# 2. Build and start
docker compose up -d

# 3. Run the initial import (first time only)
docker exec vault-sync python /app/host/sync.py --vault /vault

# 4. Verify
curl http://localhost:3456/health
```

First startup takes a few minutes while the embeddings service downloads the model (~1.3 GB).
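Conceptually, `search_vault` ranks notes by similarity between embedding vectors. A toy sketch of that ranking (the note names and 3-d vectors here are invented for illustration; the real stack embeds text with mxbai-embed-large-v1 and ranks inside pgvector):

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Pretend embeddings for three notes (made up for the example).
notes = {
    "gardening.md": [0.9, 0.1, 0.0],
    "docker-tips.md": [0.1, 0.8, 0.3],
    "compost.md": [0.8, 0.2, 0.1],
}
query = [0.85, 0.15, 0.05]  # pretend embedding of "soil and plants"

# Rank notes by similarity to the query, most similar first.
ranked = sorted(notes, key=lambda n: cosine(notes[n], query), reverse=True)
print(ranked[0])
```

pgvector does the same job at scale with indexed distance operators instead of a Python loop.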
All config lives in `.env`. Copy `.env.example` to get started:

| Variable | Required | Default | Description |
|---|---|---|---|
| `VAULT_PATH` | ✅ | — | Absolute path to your notes directory |
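For reference, a filled-in `.env` might look like the fragment below. The values are placeholders; `PGPASSWORD` and `MCP_API_KEY` come from the quick-start comments above, not from a full variable listing:

```sh
# .env: example values only, replace with your own
VAULT_PATH=/home/you/obsidian-vault
PGPASSWORD=change-me
MCP_API_KEY=replace-with-a-long-random-string
```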