Sethuram2003/MCP-ollama_server
Extends Model Context Protocol (MCP) to local LLMs via Ollama, enabling Claude-like tool use (files, web, email, GitHub, AI images) while keeping data private. Modular Python servers for on-prem AI. #LocalAI #MCP #Ollama
Platform-specific configuration:
```json
{
  "mcpServers": {
    "MCP-ollama_server": {
      "command": "npx",
      "args": [
        "-y",
        "MCP-ollama_server"
      ]
    }
  }
}
```

Add the configuration above to `.claude/settings.json` under the `mcpServers` key.
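Note that the repository describes modular Python servers, so the `npx` invocation above may not match a local checkout. A plausible alternative would launch the server with a Python interpreter instead; the `server.py` entry point and the checkout path below are assumptions for illustration, not taken from the repository:

```json
{
  "mcpServers": {
    "MCP-ollama_server": {
      "command": "python",
      "args": [
        "/path/to/MCP-ollama_server/server.py"
      ]
    }
  }
}
```

Replace `/path/to/MCP-ollama_server/server.py` with the actual script path from the cloned repository, and verify the correct entry point in its own documentation.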