
mcp-llm-gateway

MCP Tool

daedalus/mcp-llm-gateway

MCP-compatible LLM gateway that proxies completion requests to downstream OpenAI-compatible providers.

Install

$ npx loaditout add daedalus/mcp-llm-gateway

Platform-specific configuration:

.claude/settings.json
{
  "mcpServers": {
    "mcp-llm-gateway": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-llm-gateway"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.

About

MCP LLM Gateway

> MCP-compatible LLM gateway that proxies completion requests to downstream OpenAI-compatible providers.


mcp-name: io.github.daedalus/mcp-llm-gateway

Install
pip install mcp-llm-gateway
Usage
Configuration

Set the following environment variables:

  • DOWNSTREAM_URL: Base URL for the OpenAI-compatible downstream API (required)
  • DEFAULT_MODEL: Default model to use for completions (required)
  • MODEL_LIST_URL: URL to fetch available models from (optional, defaults to models.dev)
  • API_KEY: Optional API key for downstream (passthrough)
  • TIMEOUT: Request timeout in seconds (optional, default: 60)
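The variables above can be checked up front before launching the gateway. The sketch below is illustrative: only the variable names and defaults come from the list above; the loader function, its field names, and the exact models.dev URL are assumptions.

```python
import os

# Hedged sketch: validate and collect the gateway's documented environment
# variables. DOWNSTREAM_URL and DEFAULT_MODEL are required; the rest fall
# back to the documented defaults. Function and key names are hypothetical.
def load_gateway_env(env=os.environ):
    missing = [k for k in ("DOWNSTREAM_URL", "DEFAULT_MODEL") if k not in env]
    if missing:
        raise RuntimeError(f"missing required variables: {missing}")
    return {
        "downstream_url": env["DOWNSTREAM_URL"],
        "default_model": env["DEFAULT_MODEL"],
        # README says this defaults to models.dev; exact URL form assumed here
        "model_list_url": env.get("MODEL_LIST_URL", "https://models.dev"),
        "api_key": env.get("API_KEY"),          # optional passthrough
        "timeout": float(env.get("TIMEOUT", "60")),
    }

cfg = load_gateway_env({
    "DOWNSTREAM_URL": "https://api.example.com/v1",  # hypothetical endpoint
    "DEFAULT_MODEL": "gpt-4o-mini",                  # hypothetical model name
})
```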
MCP Server

Run the MCP server with stdio transport:

mcp-llm-gateway
MCP Tools

The server exposes the following tools:

  • list_models(): List all available models from the remote endpoint
  • complete(prompt, model, max_tokens, temperature): Send a completion request to the downstream LLM provider
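Since MCP stdio servers speak JSON-RPC 2.0, a client invokes these tools with the standard `tools/call` method. The sketch below shows only the message shape; transport framing and the initialization handshake are omitted, and the prompt and model values are hypothetical.

```python
import json

# Hedged sketch: the JSON-RPC 2.0 message an MCP client writes to the
# server's stdin to invoke the complete() tool listed above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "complete",
        "arguments": {
            "prompt": "Say hello",      # hypothetical prompt
            "model": "gpt-4o-mini",     # hypothetical model name
            "max_tokens": 64,
            "temperature": 0.2,
        },
    },
}
wire = json.dumps(request)  # one line of JSON per message on stdio
```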
MCP Resources
  • models://list: Returns the list of available models
  • config://info: Returns current gateway configuration
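Resources are fetched the same way, via the MCP `resources/read` method with the resource URI. Again, only the message shape is shown; framing and initialization are omitted.

```python
import json

# Hedged sketch: JSON-RPC 2.0 message to read the models://list resource.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "models://list"},
}
wire = json.dumps(request)
```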
Development
git clone https://github.com/daedalus/mcp-llm-gateway.git
cd mcp-llm-gateway
pip install -e ".[test]"

# run tests
pytest

# format
ruff format src/ tests/

# lint
ruff check src/ tests/

# type check
mypy src/
API
core.models
  • Model: Dataclass representing an available LLM model
  • CompletionRequest: Dataclass for completion request payloads
  • GatewayConfig: Dataclass for gateway configuration
adapters.http
  • HTTPAdapter: HTTP client for the downstream API
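The README lists only the dataclass names, not their fields. As a rough illustration of how such payload types are typically shaped, here is a self-contained sketch; every field name and default below is hypothetical, not the package's actual API.

```python
from dataclasses import dataclass
from typing import Optional

# Hedged sketch: plausible shapes for the documented dataclasses.
# Field names and defaults are hypothetical illustrations only.
@dataclass
class Model:
    id: str
    provider: Optional[str] = None

@dataclass
class CompletionRequest:
    prompt: str
    model: str
    max_tokens: int = 256
    temperature: float = 1.0

req = CompletionRequest(prompt="Hi", model="gpt-4o-mini")
```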

Tags

llm · mcp · openai · openai-api


Quality Signals

Installs: 0
Last updated: 13 days ago
Security: A

Safety

Risk Level: medium
Data Access: read
Network Access: none

Details

Source: github-crawl
Last commit: 4/2/2026

Embed Badge

[![Loaditout](https://loaditout.ai/api/badge/daedalus/mcp-llm-gateway)](https://loaditout.ai/skills/daedalus/mcp-llm-gateway)