loaditout.ai

llmdocs

MCP Tool

vinny380/llmdocs

Docs your agents can actually search — CLI-first, self-hosted, hybrid search + MCP + llms.txt. Open source.

Install

$ npx loaditout add vinny380/llmdocs

Platform-specific configuration:

.claude/settings.json
{
  "mcpServers": {
    "llmdocs": {
      "command": "npx",
      "args": [
        "-y",
        "llmdocs"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.
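
Other MCP clients accept the same server definition. For example, Cursor reads an `mcpServers` map from a `.cursor/mcp.json` file; a sketch, assuming the same `npx` launcher works there (the exact file location can vary by client and version):

```json
{
  "mcpServers": {
    "llmdocs": {
      "command": "npx",
      "args": ["-y", "llmdocs"]
    }
  }
}
```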

About

llmdocs

**Documentation your agents can actually search.**

Self-hosted stack: **hybrid search** (semantic + keyword), **MCP** tools, raw `.md` URLs, and `llms.txt` — no SaaS, no paid vector API required.

📦 [PyPI → llmdocs-mcp](https://pypi.org/project/llmdocs-mcp/) · 🐳 [Docker Hub](https://hub.docker.com/r/vinny2prg/llmdocs-mcp)

`pip install llmdocs-mcp` · Python 3.12+ · Embedded Chroma + local embeddings

Agents: add the hosted docs MCP → [`llmdocs-production.up.railway.app/mcp/`](https://llmdocs-production.up.railway.app/mcp/)

---

Your docs are buried in a repo. Your agent can’t find them.

Paste links into chat and hope for the best? Copy whole folders into context? That doesn’t scale — and RAG pipelines you don’t own are a second job.

llmdocs indexes your Markdown (with frontmatter), serves hybrid search over real sections, and exposes MCP so Cursor, Claude, and custom clients can call search_docs, get_doc, and list_docs against *your* corpus. You run the server; embeddings stay local by default.
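To make "hybrid search" concrete, here is a minimal, illustrative sketch of fusing a keyword score with a semantic-style score. This is not llmdocs' implementation (it uses local embeddings with Chroma); cosine similarity over raw term counts stands in for embedding similarity, and all names and documents below are hypothetical:

```python
# Sketch of hybrid-search score fusion: weighted blend of a keyword
# overlap score and a cosine "semantic" score over term-frequency vectors.
import math
from collections import Counter

def keyword_score(query: str, section: str) -> float:
    """Fraction of query terms that appear verbatim in the section."""
    q = set(query.lower().split())
    s = set(section.lower().split())
    return len(q & s) / len(q) if q else 0.0

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(query: str, section: str, alpha: float = 0.5) -> float:
    """Weighted fusion: alpha * semantic + (1 - alpha) * keyword."""
    qv = Counter(query.lower().split())
    sv = Counter(section.lower().split())
    return alpha * cosine(qv, sv) + (1 - alpha) * keyword_score(query, section)

docs = {
    "auth.md": "How to configure API keys and token auth",
    "deploy.md": "Deploying the server with Docker on Railway",
}
ranked = sorted(docs, key=lambda d: hybrid_score("docker deploy", docs[d]), reverse=True)
print(ranked[0])  # deploy.md
```

The fusion weight (`alpha` here) is the usual knob in hybrid retrieval: 1.0 is pure semantic, 0.0 is pure keyword.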

---


Tags

agentic-ai · docs-as-code · documentation-tool · mcp


Quality Signals

Installs: 0
Last updated: 24 days ago
Security: A

Safety

Risk Level: medium
Data Access: read
Network Access: none

Details

Source: github-crawl
Last commit: 3/26/2026

Embed Badge

[![Loaditout](https://loaditout.ai/api/badge/vinny380/llmdocs)](https://loaditout.ai/skills/vinny380/llmdocs)