mentu-ai/metamcp
MetaMCP collapses N child MCP servers into 6 tools
Platform-specific configuration:

```json
{
  "mcpServers": {
    "metamcp": {
      "command": "npx",
      "args": ["-y", "metamcp"]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
[npm](https://www.npmjs.com/package/@mentu/metamcp) [Node.js](https://nodejs.org) [License](LICENSE) [TypeScript](https://www.typescriptlang.org) [CI](https://github.com/mentu-ai/metamcp/actions/workflows/ci.yml)
MetaMCP connects all your MCP servers through one. Your model sees 6 tools instead of hundreds.
Think of it like a power strip for MCP servers. Plug in as many as you need -- playwright, databases, GitHub, custom tools -- and your LLM talks to one server that handles everything behind the scenes.
```
                        ┌─── playwright (52 tools)
                        │
LLM ──► MetaMCP ────────┼─── fetch (3 tools)
        (6 tools)       │
                        ├─── sqlite (6 tools)
                        │
                        └─── ... N more servers
```

Every MCP server you add registers its tool schemas with the LLM. Each schema eats context tokens. At 5 servers with 20 tools each, that's ~15,000 tokens spent on schemas alone -- every single request.
MetaMCP collapses all of that into 6 tools (~1,300 tokens). That cost stays constant whether you run 3 servers or 30. Less token overhead, better tool selection accuracy, more room for actual work.
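The arithmetic above can be sketched directly. The ~150 tokens-per-schema figure is an assumption back-derived from the "5 servers × 20 tools ≈ 15,000 tokens" example, not a measured constant:

```typescript
// Rough context-cost model for MCP tool schemas.
// TOKENS_PER_SCHEMA (~150) is an assumed average, derived from the
// "5 servers x 20 tools ~= 15,000 tokens" example above.
const TOKENS_PER_SCHEMA = 150;

// Cost of exposing every child server's tools directly to the model,
// paid on every single request.
function directCost(servers: number, toolsPerServer: number): number {
  return servers * toolsPerServer * TOKENS_PER_SCHEMA;
}

// MetaMCP's overhead is constant: 6 meta-tools, ~1,300 tokens total.
const METAMCP_COST = 1300;

console.log(directCost(5, 20)); // ~15,000 tokens per request
console.log(METAMCP_COST);      // ~1,300 tokens, regardless of server count
```

The point of the model: `directCost` grows linearly with every server you plug in, while the MetaMCP side stays flat.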
Beyond token savings, MetaMCP handles the things you shouldn't have to think about: connection pooling, process lifecycle, error recovery, schema caching, and transport differences between local and remote servers.
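A minimal sketch of the connection-pooling idea: a child server is started lazily on first use and reused for every later call. The `ChildServer` shape and the `start` factory are illustrative, not MetaMCP's actual API:

```typescript
// Illustrative sketch of lazy child-server pooling (not MetaMCP's real API).
interface ChildServer {
  name: string;
  call(tool: string, args: unknown): Promise<unknown>;
}

class ServerPool {
  private servers = new Map<string, ChildServer>();

  // The factory abstracts process spawning (e.g. launching `npx` children);
  // injecting it keeps the pooling logic testable.
  constructor(private start: (name: string) => ChildServer) {}

  // First request spawns the server; later requests reuse the connection.
  get(name: string): ChildServer {
    let server = this.servers.get(name);
    if (!server) {
      server = this.start(name);
      this.servers.set(name, server);
    }
    return server;
  }

  // Tear down every pooled server on shutdown.
  shutdown(): void {
    this.servers.clear();
  }
}
```

A real implementation would also restart crashed children on the next call and reap idle processes, which is exactly the lifecycle work the proxy takes off your hands.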
Install and run:

```bash
npx @mentu/metamcp              # run directly (no install)
npm install -g @mentu/metamcp   # or install globally
```

Auto-configure your editor (Claude Desktop, Claude Code, Cursor, VS Code, Windsurf, and more).