sandraschi/openmanus-mcp
FastMCP 3.1 server + webapp wrapping OpenManus (the FOSS CLI; runs against a 100% local LLM once you configure Ollama or LM Studio in OpenManus). Not Manus.im.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "openmanus-mcp": {
      "command": "npx",
      "args": ["-y", "openmanus-mcp"]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
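If you prefer to script that merge, here is a small stdlib-only sketch that registers the server in `.claude/settings.json` while preserving any servers already configured (the helper name `add_mcp_server` is ours, not part of any tool):

```python
import json
from pathlib import Path

def add_mcp_server(settings: dict, name: str, entry: dict) -> dict:
    """Return a copy of settings with the server registered under
    mcpServers, keeping any previously configured servers intact."""
    merged = dict(settings)
    servers = dict(merged.get("mcpServers", {}))
    servers[name] = entry
    merged["mcpServers"] = servers
    return merged

path = Path(".claude/settings.json")
settings = json.loads(path.read_text()) if path.exists() else {}
settings = add_mcp_server(
    settings,
    "openmanus-mcp",
    {"command": "npx", "args": ["-y", "openmanus-mcp"]},
)
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(settings, indent=2))
```

Re-running the script is idempotent: the entry is simply overwritten with the same value.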
<div align="center">
<br/>
[Releasing](./RELEASING.md) [CI](https://github.com/sandraschi/openmanus-mcp/actions/workflows/ci.yml) [Install](docs/INSTALL.md) [FastMCP](https://github.com/jlowin/fastmcp) [License](LICENSE) [Glama](https://glama.ai/mcp/servers?query=openmanus)
Bridge and browser UI for [OpenManus](https://github.com/FoundationAgents/OpenManus) — the FOSS agent, not Manus.im. Use your own Ollama, LM Studio, or other local endpoints; this repo stays out of your wallet by default.
Install · Tech · Glama · How we build
</div>
---
> Beta — behavior may still shift; pin a tag for anything serious → RELEASING.md.
Webapp to start and control tasks: run OpenManus (sync or queued), pick presets and activities, chat with a local model, onboard other MCP servers into your fleet, and skim the API help. Step-by-step setup: INSTALL.md.
Use it over MCP from Cursor, Claude Desktop, Glama, and similar hosts. Implemented with [FastMCP 3.1+](https://github.com/jlowin/fastmcp); the `openmanus_bridge` tool checks your OpenManus install, runs prompts, and polls async jobs without leaving the editor. Client setup: INSTALL.md · transport and A