BerriAI/litellm
Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
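As a rough illustration of the OpenAI-format interface described above, a minimal sketch of a call through the LiteLLM Python SDK might look like the following. The helper name `ask` and the model string are illustrative choices, not part of the library; running it requires `pip install litellm` and provider credentials (e.g. an `OPENAI_API_KEY` environment variable).

```python
def ask(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Send a single chat message via LiteLLM's OpenAI-style completion API."""
    # Imported inside the function so the sketch loads even where litellm
    # is not installed; in real code, import at module top level.
    import litellm

    response = litellm.completion(
        model=model,  # swap in any supported provider/model string
        messages=[{"role": "user", "content": prompt}],
    )
    # Responses follow the OpenAI chat-completion shape.
    return response.choices[0].message.content
```

Because every provider is exposed through the same completion format, switching backends is typically just a change of the `model` string.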
Platform-specific configuration:
{
  "mcpServers": {
    "litellm": {
      "command": "npx",
      "args": [
        "-y",
        "litellm"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.