paresh2806/HF-Docker-Private-Ollama-Server
Run private LLMs using Ollama, containerized with Docker and deployed on Hugging Face GPU Spaces. Supports token-based authentication using Hugging Face API tokens, and exposes secure REST APIs for model generation, pulling models, listing available models, tool calling, and keeping models alive.
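As a minimal sketch of what calling such a server might look like: the request body below follows Ollama's standard `/api/generate` schema (`model`, `prompt`, `stream`), but the Space URL, model name, and the exact auth-header shape (HF token as a bearer token) are assumptions, not confirmed by this project.

```python
# Hedged sketch: querying a private Ollama server hosted on a HF Space.
# SPACE_URL and the bearer-token auth scheme are hypothetical placeholders.
import json
import urllib.request

SPACE_URL = "https://your-username-your-space.hf.space"  # hypothetical Space URL


def build_generate_request(prompt: str, model: str, hf_token: str) -> urllib.request.Request:
    """Build a POST to Ollama's /api/generate endpoint with HF-token auth."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{SPACE_URL}/api/generate",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {hf_token}",  # assumption: HF token sent as bearer auth
        },
        method="POST",
    )


req = build_generate_request("Why is the sky blue?", "llama3.2", "hf_xxx")
# urllib.request.urlopen(req) would send the request; omitted to keep the sketch offline.
```

Sending the request with `urllib.request.urlopen(req)` would return Ollama's JSON response, whose generated text is under the `response` key when streaming is disabled.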
Platform-specific configuration:
{
"mcpServers": {
"HF-Docker-Private-Ollama-Server": {
"command": "npx",
"args": [
"-y",
"HF-Docker-Private-Ollama-Server"
]
}
}
}
Add the config above to .claude/settings.json under the mcpServers key.