reallyreallyryan/hearth
Hearth is a local-first persistent AI memory system. Install it once, and every AI tool you use — Claude Desktop, LM Studio, Cursor — can store and recall memories across conversations. Your data stays on your machine in a single SQLite file.
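To make the storage model concrete, here is a minimal sketch of what "a single SQLite file" implies. Every name in it (the `memories` table and its columns) is an assumption for illustration only — this README doesn't document Hearth's actual schema — but the flow it shows (store a categorized, tagged memory locally, recall it with a keyword lookup) matches the description above.

```python
import sqlite3

# Hypothetical schema for illustration — Hearth's real table layout may differ.
# ":memory:" keeps the demo side-effect-free; Hearth itself uses one file on disk.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE memories (
        id INTEGER PRIMARY KEY,
        content TEXT NOT NULL,
        category TEXT DEFAULT 'general',
        project TEXT,
        tags TEXT
    )
    """
)
conn.execute(
    "INSERT INTO memories (content, category, project, tags) VALUES (?, ?, ?, ?)",
    ("Prefers concise answers", "decision", "demo", "prefs,style"),
)
conn.commit()

# Plain keyword lookup — the kind of search that works even without embeddings.
row = conn.execute(
    "SELECT content FROM memories WHERE content LIKE ?", ("%concise%",)
).fetchone()
print(row[0])
```

Because everything lives in one local database file, backing up or inspecting your memories never requires a network call.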
Platform-specific configuration:
```json
{
  "mcpServers": {
    "hearth": {
      "command": "npx",
      "args": ["-y", "hearth"]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
Every AI you talk to forgets you. Hearth makes them remember.
```shell
git clone https://github.com/reallyreallyryan/hearth.git
cd hearth
pip install -e .
hearth init
```

That's it. You now have a working memory system. See the Install Guide for detailed step-by-step instructions.
Find the installed binary with `which hearth`, then add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "hearth": {
      "command": "/full/path/to/hearth",
      "args": ["serve"]
    }
  }
}
```

Replace `/full/path/to/hearth` with the output of `which hearth`.
hearth init prints the exact config with the full path filled in — just copy and paste it.
Add to `~/.lmstudio/mcp.json`:

```json
{
  "mcpServers": {
    "hearth": {
      "command": "/full/path/to/hearth",
      "args": ["serve"]
    }
  }
}
```

```shell
hearth init                  # Set up ~/hearth/ — database, config, pull embedding model
hearth init --no-models      # Set up without Ollama (cloud-only, keyword search only)
hearth serve                 # Start the MCP server (stdio transport)
hearth status                # Show memory count, embedding status, Ollama availability
hearth remember "some fact"  # Store a memory from the command line
hearth search "query"        # Search your memories
```

Options for `remember`: `-c` category (general, learning, pattern, reference, decision), `-p` project, `-t "tag1,tag2"`.
Options for `search`: `-p` project, `-c` category, `-n` limit.
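Putting those flags together, a hypothetical session might look like this (the fact, project name, and tags are made up for illustration):

```shell
# Store a categorized, tagged memory scoped to a project
hearth remember "We chose Postgres over MySQL" -c decision -p myapp -t "db,architecture"

# Recall it later, limited to that project, top 3 results
hearth search "database choice" -p myapp -n 3
```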