open-webui/mcpo
A simple, secure MCP-to-OpenAPI proxy server
Expose any MCP tool as an OpenAPI-compatible HTTP server—instantly.
mcpo is a dead-simple proxy that takes an MCP server command and makes it accessible via standard RESTful OpenAPI, so your tools "just work" with LLM agents and apps expecting OpenAPI servers.
No custom protocol. No glue code. No hassle.
MCP servers usually speak over raw stdio, which is:

- Insecure by default
- Incompatible with most tools
- Missing standard features like docs, auth, and error handling

mcpo solves all of that—without extra effort:

- Works instantly with OpenAPI tools, SDKs, and UIs
- Adds security, stability, and scalability using trusted web standards
- Auto-generates interactive docs for every tool, no config needed
- Uses plain HTTP: no sockets, no glue code, no surprises
What feels like "one more step" is really fewer steps with better outcomes.
mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.
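Concretely, each proxied MCP tool becomes a plain HTTP endpoint protected by the API key you pass with `--api-key`. A minimal Python sketch of what a client request looks like; the host, tool name (`get_time`), and arguments here are hypothetical placeholders, not part of mcpo itself:

```python
import json
import urllib.request

# Hypothetical values: adjust the host, tool name, and arguments to your setup.
BASE_URL = "http://localhost:8000"
API_KEY = "top-secret"

def build_tool_request(tool: str, arguments: dict) -> urllib.request.Request:
    """Build (but do not send) a POST request for a proxied MCP tool."""
    return urllib.request.Request(
        url=f"{BASE_URL}/{tool}",
        data=json.dumps(arguments).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",  # matches the --api-key flag
        },
        method="POST",
    )

req = build_tool_request("get_time", {"timezone": "UTC"})
print(req.full_url, req.get_method())
```

Because the result is ordinary JSON-over-HTTP, any OpenAPI-aware client, SDK generator, or agent framework can consume it without MCP-specific glue.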
We recommend using uv for lightning-fast startup and zero config.

```bash
uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
```

Or, if you're using Python:

```bash
pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
```

To use an SSE-compatible MCP server, simply specify the server type and endpoint:

```bash
mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse
```

You can also provide headers for the SSE connection:

```bash
mcpo --port 8000 --api-key "top-secret" --server-type "sse" --header '{"Authorization": "Bearer token", "X-Custom-Header": "value"}' -- http://127.0.0.1:8001/sse
```

To use a Streamable HTTP-compatible MCP server, specify the server type and endpoint:

```bash
mcpo --port 8000 --api-key "top-secret" --server-type "streamable-http" -- http://127.0.0.1:8002/mcp
```

You can also run mcpo via Docker with no installation:
```bash
docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command
```
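However you launch it, mcpo serves a generated OpenAPI schema describing every proxied tool (FastAPI-style, at `/openapi.json`, with interactive docs at `/docs`). A sketch of how a client could enumerate the available tool endpoints from that schema; the embedded schema fragment and tool names are illustrative, not mcpo's exact output:

```python
import json

# Illustrative fragment of a generated schema; mcpo's real output will differ.
openapi_doc = json.loads("""
{
  "openapi": "3.1.0",
  "info": {"title": "mcpo proxy", "version": "0.1.0"},
  "paths": {
    "/get_time": {"post": {"operationId": "get_time"}},
    "/memory_search": {"post": {"operationId": "memory_search"}}
  }
}
""")

def list_tool_endpoints(doc: dict) -> list[str]:
    """Return 'METHOD /path' strings for every operation in an OpenAPI document."""
    endpoints = []
    for path, operations in doc.get("paths", {}).items():
        for method in operations:
            endpoints.append(f"{method.upper()} {path}")
    return sorted(endpoints)

print(list_tool_endpoints(openapi_doc))
```

This is exactly the discovery step that OpenAPI-based agents and SDK generators perform automatically, which is why proxied tools "just work" with them.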