enescingoz/colab-llm
This repository provides a ready-to-use Google Colab notebook that turns Colab into a temporary server for running local LLMs with Ollama. It exposes the model API through a secure Cloudflare tunnel, allowing remote access from tools like curl or Roo Code in VS Code, with no server setup or cloud deployment required.
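For example, once the notebook prints the tunnel URL, the Ollama API can be queried from any machine. Below is a minimal sketch, assuming a hypothetical tunnel URL and that a model such as llama3 has already been pulled in the Colab session:

# Minimal sketch: query the Ollama API exposed through the Cloudflare tunnel.
# TUNNEL_URL is a placeholder for the URL printed by the notebook, and the
# model name assumes "llama3" was pulled in the Colab session.
import requests

TUNNEL_URL = "https://example-tunnel.trycloudflare.com"  # hypothetical

response = requests.post(
    f"{TUNNEL_URL}/api/generate",
    json={
        "model": "llama3",
        "prompt": "Explain what a Cloudflare tunnel is in one sentence.",
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])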
Platform-specific configuration:
{
  "mcpServers": {
    "colab-llm": {
      "command": "npx",
      "args": [
        "-y",
        "colab-llm"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.