loaditout.ai

local_vision

MCP Tool

DavidEasden/local_vision

An MCP server that uses a local VLM to convert images into descriptive text for large models without visual capabilities

Install

$ npx loaditout add DavidEasden/local_vision

Platform-specific configuration:

.claude/settings.json
{
  "mcpServers": {
    "local_vision": {
      "command": "npx",
      "args": [
        "-y",
        "local_vision"
      ]
    }
  }
}

Add the config above to .claude/settings.json under the mcpServers key.

About

Local Vision MCP

A native visual analysis tool based on MCP (Model Context Protocol) that uses LM Studio's vision model to analyze images.

Features
  • 🔍 Local image analysis: analyzes local images in PNG, JPG, JPEG, and WEBP formats
  • 🤖 Vision model integration: uses LM Studio's vision models for image understanding
  • 🛡️ Security restrictions: only allows access to existing local image files, preventing path-traversal attacks
  • ⚡ Fast response: asynchronous processing with support for custom analysis prompts
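The path-restriction feature above could be implemented roughly like this. This is a hypothetical sketch, not the tool's actual code: `validate_image_path` and `ALLOWED_SUFFIXES` are illustrative names, and the real check in `main.py` may differ.

```python
from pathlib import Path

# Formats the README says are supported.
ALLOWED_SUFFIXES = {".png", ".jpg", ".jpeg", ".webp"}

def validate_image_path(raw_path: str) -> Path:
    """Resolve a user-supplied path and reject traversal tricks or
    unsupported formats before the file is ever opened."""
    path = Path(raw_path).expanduser().resolve()  # collapses ../ segments
    if path.suffix.lower() not in ALLOWED_SUFFIXES:
        raise ValueError(f"Unsupported image format: {path.suffix}")
    if not path.is_file():
        raise FileNotFoundError(f"Not an existing local image file: {path}")
    return path
```

Resolving the path first means a request like `images/../../etc/passwd` is normalized before the suffix and existence checks run.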
System Requirements
  • Python 3.8+
  • LM Studio (or another compatible local server) running
  • Supported image formats: PNG, JPG, JPEG, WEBP
Installation and configuration
1. Environment preparation (optional)

Make sure Conda is installed and create an 'mlx' environment:

conda create -n mlx python=3.10
conda activate mlx
2. Install dependencies

Install the required dependencies in the project directory:

pip install mcp httpx
3. Configure LM Studio
  1. Launch LM Studio
  2. Download and load a vision model (e.g. 'qwen3.5:2b-bf16')
  3. Make sure the LM Studio API service is running at 'http://localhost:11434'
4. Environment Variable Configuration (Optional)

The following environment variables can be set:

# Set the LM Studio service address (default: http://localhost:11434)
export LM_STUDIO_URL="http://localhost:11434"

# Set the visual model name (default: qwen3.5:2b-bf16)
export VISION_MODEL="qwen3.5:2b-bf16"

# Set the Conda environment name (default: mlx)
export MCP_CONDA_ENV="mlx"
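A server like this would typically read these variables with `os.environ.get`, falling back to the defaults listed above. A sketch (variable names and defaults come from this README; the actual `main.py` may read them differently):

```python
import os

# Defaults match the comments in the export examples above.
LM_STUDIO_URL = os.environ.get("LM_STUDIO_URL", "http://localhost:11434")
VISION_MODEL = os.environ.get("VISION_MODEL", "qwen3.5:2b-bf16")
MCP_CONDA_ENV = os.environ.get("MCP_CONDA_ENV", "mlx")
```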
How to use
Running as an MCP server
  1. Direct run:
python main.py
  2. Configure in Opencode:

Add the following to your `opencode.json`:

{
  "mcp": {
    "local_vision": {
      "type": "local",
      "command": ["python", "/path/to/main.py"],
      "environment": {
        "LM_STUDIO_URL": "http://localhost:11434",
        "VISION_MODEL": "qwen3.5:2b-bf16"
      }
    }
  }
}

Tags

mcp-server


Quality Signals
  • Installs: 0
  • Last updated: 19 days ago
  • Security: A

Safety
  • Risk level: medium
  • Data access: read
  • Network access: none

Details
  • Source: github-crawl
  • Last commit: 4/2/2026
  • View on GitHub

Embed Badge

[![Loaditout](https://loaditout.ai/api/badge/DavidEasden/local_vision)](https://loaditout.ai/skills/DavidEasden/local_vision)