lalanikarim/comfy-mcp-server
A server using FastMCP framework to generate images based on prompts via a remote Comfy server.
[Smithery](https://smithery.ai/server/@lalanikarim/comfy-mcp-server)
This script sets up a server using the FastMCP framework to generate images from prompts with a specified ComfyUI workflow. It interacts with a remote Comfy server to submit prompts and retrieve the generated images.
The file Flux-Dev-ComfyUI-Workflow.json is included only as a reference. You will need to export the API-format JSON from your own workflow and set the environment variables accordingly.

You can install the required packages for local development:
```shell
uvx mcp[cli]
```

Set the following environment variables:
- `COMFY_URL` to point to your Comfy server URL.
- `COMFY_WORKFLOW_JSON_FILE` to point to the absolute path of the API export JSON file for the ComfyUI workflow.
- `PROMPT_NODE_ID` to the id of the text prompt node.
- `OUTPUT_NODE_ID` to the id of the output node with the final image.
- `OUTPUT_MODE` to either `url` or `file` to select the desired output.

Optionally, if you have an Ollama server running, you can connect to it for prompt generation:
- `OLLAMA_API_BASE` to the URL where Ollama is running.
- `PROMPT_LLM` to the name of the model hosted on Ollama for prompt generation.

Example:
```shell
export COMFY_URL=http://your-comfy-server-url:port
export COMFY_WORKFLOW_JSON_FILE=/path/to/the/comfyui_workflow_export.json
export PROMPT_NODE_ID=6 # use the correct node id here
export OUTPUT_NODE_ID=9 # use the correct node id here
export OUTPUT_MODE=file
```

Comfy MCP Server can be launched with the following command:
```shell
uvx comfy-mcp-server
```
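To use the server from an MCP client such as Claude Desktop, it is typically registered in the client's `mcpServers` configuration. A minimal sketch, assuming the `uvx` launcher and the environment variables shown above (the `"comfy-mcp-server"` entry name and the example values are illustrative, not mandated by this project):

```json
{
  "mcpServers": {
    "comfy-mcp-server": {
      "command": "uvx",
      "args": ["comfy-mcp-server"],
      "env": {
        "COMFY_URL": "http://your-comfy-server-url:port",
        "COMFY_WORKFLOW_JSON_FILE": "/path/to/the/comfyui_workflow_export.json",
        "PROMPT_NODE_ID": "6",
        "OUTPUT_NODE_ID": "9",
        "OUTPUT_MODE": "file"
      }
    }
  }
}
```

Passing the environment variables through the `env` block keeps the client launch self-contained, so no shell `export` is needed beforehand.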