loaditout.ai

ollama-api-compatible-server

MCP Tool

beekmarks/ollama-api-compatible-server

This project provides an Ollama API-compatible server that uses the `llama-cpp-python` library to run local LLM inference. It allows you to use your own GGUF models with an API that's compatible with Ollama's endpoints, making it easy to integrate with existing tools and applications designed to work with Ollama.
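Because the server mirrors Ollama's REST endpoints, any Ollama client should work unchanged. As a minimal sketch (assuming the server runs locally on Ollama's default port 11434 and exposes `/api/generate`; the model name here is a placeholder for whatever GGUF model you loaded):

```python
import json

# Build an Ollama-style /api/generate request body.
# "my-local-model" is hypothetical -- substitute the model the server serves.
payload = {
    "model": "my-local-model",
    "prompt": "Why is the sky blue?",
    "stream": False,  # request a single JSON response instead of a stream
}
body = json.dumps(payload)
print(body)

# To actually send it (requires the server to be running):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=body.encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Tools that already speak Ollama's API (chat UIs, editor plugins) can be pointed at this server by setting their Ollama base URL to the local address it listens on.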

Install

$ npx loaditout add beekmarks/ollama-api-compatible-server

Platform-specific configuration:

.claude/settings.json
{
  "mcpServers": {
    "ollama-api-compatible-server": {
      "command": "npx",
      "args": [
        "-y",
        "ollama-api-compatible-server"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.
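If you prefer to add the entry programmatically, the change is a simple JSON read-modify-write. A sketch, assuming the settings file may or may not already exist (the path and server entry mirror the config above):

```python
import json
from pathlib import Path

SETTINGS = Path(".claude/settings.json")
ENTRY = {
    "command": "npx",
    "args": ["-y", "ollama-api-compatible-server"],
}

# Load existing settings if present, otherwise start from an empty object.
settings = json.loads(SETTINGS.read_text()) if SETTINGS.exists() else {}

# Merge under "mcpServers" without clobbering any other configured servers.
settings.setdefault("mcpServers", {})["ollama-api-compatible-server"] = ENTRY

SETTINGS.parent.mkdir(parents=True, exist_ok=True)
SETTINGS.write_text(json.dumps(settings, indent=2))
```

Using `setdefault` preserves any MCP servers already listed in the file rather than overwriting the whole `mcpServers` object.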


Quality Signals

Stars: 1
Installs: 0
Last updated: 399 days ago
Security: C

Safety

Risk Level: medium
Data Access: read
Network Access: none

Details

Source: github-crawl
Last commit: 3/18/2025
View on GitHub →

Embed Badge

[![Loaditout](https://loaditout.ai/api/badge/beekmarks/ollama-api-compatible-server)](https://loaditout.ai/skills/beekmarks/ollama-api-compatible-server)