loaditout.ai

azex-ai/sdks

MCP Tool

Official Azex SDKs — TypeScript, Python, MCP Server, CLI for the crypto-native LLM API gateway

Install

$ npx loaditout add azex-ai/sdks

Platform-specific configuration:

.claude/settings.json
{
  "mcpServers": {
    "sdks": {
      "command": "npx",
      "args": [
        "-y",
        "sdks"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.
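If you prefer to script this step, the entry can be merged into an existing settings file programmatically. The sketch below uses only the Node.js standard library and assumes the file layout shown above; it preserves any other servers already configured.

```typescript
// Sketch: merge the "sdks" MCP server entry into .claude/settings.json
// without clobbering other configured servers. Uses only Node.js stdlib.
import { readFileSync, writeFileSync, existsSync, mkdirSync } from "fs";

const settingsPath = ".claude/settings.json";

// Ensure the .claude directory exists.
mkdirSync(".claude", { recursive: true });

// Load existing settings if present, otherwise start fresh.
const settings = existsSync(settingsPath)
  ? JSON.parse(readFileSync(settingsPath, "utf8"))
  : {};

// Add the server entry from the config above under the mcpServers key.
settings.mcpServers = {
  ...settings.mcpServers,
  sdks: { command: "npx", args: ["-y", "sdks"] },
};

writeFileSync(settingsPath, JSON.stringify(settings, null, 2) + "\n");
```

Running this leaves any pre-existing `mcpServers` entries in place and adds (or overwrites) only the `sdks` entry.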

About

Azex SDKs

Official client libraries for the Azex API — a crypto-native LLM gateway with pay-as-you-go pricing via stablecoin deposits.

Packages

| Package | Language | Install | Description |
|---------|----------|---------|-------------|
| `typescript/` | TypeScript/Node.js | `npm install azex` | Full SDK with streaming, pagination, typed responses |
| `python/` | Python 3.9+ | `pip install azex` | Sync + async clients, httpx, pydantic v2 |
| `mcp/` | TypeScript | `npx @azex/mcp-server` | MCP server for AI agents (Claude Desktop, Claude Code) |
| `cli/` | Go | `brew install azex-ai/tap/azex` | Terminal client with streaming chat, logs tail, raw API |

Quick Start
TypeScript
import Azex from 'azex';

const azex = new Azex({ apiKey: 'sk_live_...' });

// OpenAI-compatible
const completion = await azex.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Hello' }],
});

// Anthropic-compatible
const message = await azex.messages.create({
  model: 'anthropic/claude-sonnet-4-6',
  messages: [{ role: 'user', content: 'Hello' }],
  max_tokens: 1024,
});

// Streaming
const stream = await azex.chat.completions.create({
  model: 'deepseek/deepseek-chat',
  messages: [{ role: 'user', content: 'Hello' }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
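Network calls like the ones above can fail transiently. A small retry wrapper is one common way to harden them; the helper below is an illustrative sketch, not part of the Azex SDK, and the retry count and backoff schedule are assumptions.

```typescript
// Sketch: retry a promise-returning call with exponential backoff.
// Generic over the return type, so it wraps any of the calls above.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff: baseDelayMs, 2x, 4x, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

For example: `const completion = await withRetry(() => azex.chat.completions.create({ model: 'openai/gpt-4o', messages: [{ role: 'user', content: 'Hello' }] }));`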
Python
from azex import Azex

client = Azex(api_key="sk_live_...")

# OpenAI-compatible
response = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

# Streaming
with client.chat.completions.stream(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "Hello"}],
) as stream:
    for chunk in stream:
        print(chunk.choices[0].delta.content, end="")
MCP Server
# Claude Code
claude mcp add azex -- 

Tags

ai, anthropic, api-gateway, cli, crypto, llm, mcp, openai, python, sdk, stablecoin, typescript


Quality Signals

Installs: 0
Last updated: 18 days ago
Security: A
README

Safety

Risk Level: medium
Data Access: read
Network Access: none

Details

Source: github-crawl
Last commit: 3/26/2026
View on GitHub →

Embed Badge

[![Loaditout](https://loaditout.ai/api/badge/azex-ai/sdks)](https://loaditout.ai/skills/azex-ai/sdks)