loaditout.ai

hearth

MCP Tool

reallyreallyryan/hearth

Hearth is a local-first persistent AI memory system. Install it once, and every AI tool you use — Claude Desktop, LM Studio, Cursor — can store and recall memories across conversations. Your data stays on your machine in a single SQLite file.

Install

$ npx loaditout add reallyreallyryan/hearth

Platform-specific configuration:

.claude/settings.json
{
  "mcpServers": {
    "hearth": {
      "command": "npx",
      "args": [
        "-y",
        "hearth"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.
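If you already have other MCP servers configured, the new entry has to be merged under the existing mcpServers key rather than replacing it. A minimal sketch of that merge (not Hearth's own code; the example server name is made up):

```python
import json

# Sketch: merge the hearth entry into an existing .claude/settings.json
# payload without clobbering other configured MCP servers.
def add_hearth(settings: dict) -> dict:
    servers = settings.setdefault("mcpServers", {})
    servers["hearth"] = {"command": "npx", "args": ["-y", "hearth"]}
    return settings

# "filesystem" is a hypothetical pre-existing server for illustration.
existing = {"mcpServers": {"filesystem": {"command": "npx"}}}
merged = add_hearth(existing)
print(json.dumps(merged, indent=2))
```

The same shape applies to the Claude Desktop and LM Studio configs shown below; only the file path differs.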

About

Hearth

Every AI you talk to forgets you. Hearth makes them remember.


Quick Start
git clone https://github.com/reallyreallyryan/hearth.git
cd hearth
pip install -e .
hearth init

That's it. You now have a working memory system. See the Install Guide for detailed step-by-step instructions.
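The "single SQLite file" claim can be illustrated with a toy sketch. Hearth's actual schema is not documented here, so the table and columns below are hypothetical; the point is that store and recall are plain local database operations with no network involved:

```python
import sqlite3

# Illustrative only: hypothetical schema standing in for Hearth's real one.
# An in-memory database substitutes for the on-disk file under ~/hearth/.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE memories (id INTEGER PRIMARY KEY, content TEXT, category TEXT)"
)
con.execute(
    "INSERT INTO memories (content, category) VALUES (?, ?)",
    ("User prefers tabs over spaces", "pattern"),
)

# Keyword-style recall, as with `hearth init --no-models` (no embeddings).
rows = con.execute(
    "SELECT content FROM memories WHERE content LIKE ?", ("%tabs%",)
).fetchall()
print(rows[0][0])
```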

Connect to Claude Desktop
1. Find your hearth path:

   which hearth

2. Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

   {
     "mcpServers": {
       "hearth": {
         "command": "/full/path/to/hearth",
         "args": ["serve"]
       }
     }
   }

   Replace /full/path/to/hearth with the output from which hearth.

3. Quit and reopen Claude Desktop (Cmd+Q, then relaunch).

hearth init prints the exact config with the full path filled in — just copy and paste it.
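Generating that config yourself is straightforward. A hedged sketch of what the printed output plausibly looks like, resolving the absolute path the same way `which hearth` does (this is not Hearth's source, just the same idea):

```python
import json
import shutil

# Resolve the absolute hearth path from PATH; fall back to the
# documentation placeholder if hearth is not installed (as here).
path = shutil.which("hearth") or "/full/path/to/hearth"

config = {"mcpServers": {"hearth": {"command": path, "args": ["serve"]}}}
print(json.dumps(config, indent=2))
```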

Connect to LM Studio

Add to ~/.lmstudio/mcp.json:

{
  "mcpServers": {
    "hearth": {
      "command": "/full/path/to/hearth",
      "args": ["serve"]
    }
  }
}

CLI Commands
hearth init                    # Set up ~/hearth/ — database, config, pull embedding model
hearth init --no-models        # Set up without Ollama (cloud-only, keyword search only)
hearth serve                   # Start the MCP server (stdio transport)
hearth status                  # Show memory count, embedding status, Ollama availability
hearth remember "some fact"    # Store a memory from the command line
hearth search "query"          # Search your memories

Options for remember: -c category (general, learning, pattern, reference, decision), -p project, -t "tag1,tag2"

Options for search: -p project, -c category, -n limit
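Putting those flags together (illustrative invocations only; output will depend on your install and stored memories):

```shell
# Store a categorized, tagged memory scoped to a project
hearth remember "Team standardized on Postgres 16" -c decision -p backend -t "db,infra"

# Recall it later, filtered to the same project, top 5 results
hearth search "which database" -p backend -n 5
```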

Tags

ai-memory, ai-tools, claude-desktop, knowledge-base, lm-studio, local-first, mcp, mcp-server, model-context-protocol, ollama, python, semantic-search, sqlite, vector-search


Quality Signals

Installs: 0
Last updated: 25 days ago
Security: A
README

Safety

Risk Level: medium
Data Access: read
Network Access: none

Details

Source: github-crawl
Last commit: 3/23/2026
View on GitHub

Embed Badge

[![Loaditout](https://loaditout.ai/api/badge/reallyreallyryan/hearth)](https://loaditout.ai/skills/reallyreallyryan/hearth)