jayluxferro/ollama-mcp
MCP server that exposes your local Ollama API as tools so agents (Cursor, Claude Desktop, etc.) can use local models for chat, completion, and embeddings.
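Under the hood the server talks to the standard Ollama HTTP API. As a rough sketch of the kind of calls those tools wrap (assuming Ollama is running on its default port 11434 and a model such as "llama3.2" has already been pulled locally; the actual tool names and parameters are defined by ollama-mcp itself):

```typescript
// Sketch of the local Ollama endpoints the MCP tools presumably wrap.
// Assumes Node 18+ (global fetch) and a locally running Ollama instance.
const OLLAMA_URL = "http://localhost:11434";

// Chat: send a message list, get the assistant reply back.
async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // assumed model name; use any model you have pulled
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.message.content;
}

// Embeddings: turn text into a vector for search or RAG workflows.
async function embed(text: string): Promise<number[]> {
  const res = await fetch(`${OLLAMA_URL}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.2", prompt: text }),
  });
  const data = await res.json();
  return data.embedding;
}

chat("Why is the sky blue?").then(console.log);
```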
Platform-specific configuration:
```json
{
  "mcpServers": {
    "ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "ollama-mcp"
      ]
    }
  }
}
```

Add the config above to .claude/settings.json under the mcpServers key.
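For Cursor, the same `mcpServers` block typically goes in a project-level `.cursor/mcp.json` (or `~/.cursor/mcp.json` for a global setup) instead of `.claude/settings.json`. This placement is based on Cursor's current MCP configuration conventions, so verify it against the Cursor documentation:

```json
{
  "mcpServers": {
    "ollama-mcp": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
```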