xiaolai/codex-octopus
One brain, many arms — spawn multiple specialized Codex agents as MCP servers
Platform-specific configuration:

```json
{
  "mcpServers": {
    "codex-octopus": {
      "command": "npx",
      "args": [
        "-y",
        "codex-octopus"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
One brain, many arms.
An MCP server that wraps the OpenAI Codex SDK, letting you run multiple specialized Codex agents — each with its own model, sandbox, effort, and personality — from any MCP client.
Codex is powerful. But one instance does everything the same way. Sometimes you want a strict code reviewer in a read-only sandbox. A test writer with workspace-write access. A cheap, quick helper on minimal effort. A deep thinker on xhigh.
Codex Octopus lets you spin up as many of these as you need. Same binary, different configurations. Each one shows up as a separate tool in your MCP client.
Requirements:

- The Codex SDK (`@openai/codex`)
- An API key, either set via the `CODEX_API_KEY` env var or inherited from the parent process

Install:

```sh
npm install codex-octopus
```

Or use npx directly in your `.mcp.json` (see Quick Start below).
Add to your `.mcp.json`:

```json
{
  "mcpServers": {
    "codex": {
      "command": "npx",
      "args": ["codex-octopus@latest"],
      "env": {
        "CODEX_SANDBOX_MODE": "workspace-write",
        "CODEX_APPROVAL_POLICY": "never"
      }
    }
  }
}
```

This gives you two tools: `codex` and `codex_reply`. That's it: you have Codex as a tool.
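To make the two tools concrete, here is a rough sketch of the JSON-RPC `tools/call` request an MCP client issues when it invokes the `codex` tool. The `prompt` argument name is an assumption for illustration; the actual argument schema comes from the server's `tools/list` response.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "codex",
    "arguments": {
      "prompt": "Review the diff in src/auth.ts for security issues"
    }
  }
}
```

A follow-up turn would presumably go through `codex_reply` instead, carrying whatever session identifier the first call returned, so the agent keeps its context across turns.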
The real power is running several instances with different configurations:
```json
{
  "mcpServers": {
    "code-reviewer": {
      "command": "npx",
      "args": ["codex-octopus@latest"],
      "env": {
        "CODEX_TOOL_NAME": "code_reviewer",
        "CODEX_SERVER_NAME": "code-reviewer",
        "CODEX_DESCRIPTION": "Strict code reviewer. Read-only."
      }
    }
  }
}
```