# robot-resources/scraper-mcp

MCP server for Robot Resources Scraper — web content to clean markdown with 70-80% token reduction.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "scraper-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "scraper-mcp"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
[npm](https://www.npmjs.com/package/@robot-resources/scraper-mcp) [license](https://github.com/robot-resources/scraper-mcp/blob/main/LICENSE)
> MCP server for Scraper — context compression for AI agents.
Human Resources, but for your AI agents.

Robot Resources gives AI agents two superpowers. Both run locally; your API keys never leave your machine. Free, unlimited, no tiers.
```shell
npx robot-resources
```

One command sets up everything. Learn more at robotresources.ai.
---
This package gives AI agents two tools to compress web content into token-efficient markdown via the Model Context Protocol: single-page compression and multi-page BFS crawling.
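Multi-page crawling proceeds in breadth-first order: pages one link away from the start URL are processed before pages two links away. A minimal sketch of that traversal, using a hypothetical in-memory link graph in place of real fetches (none of these names are the package's actual API):

```typescript
// Hypothetical sketch of BFS crawl order; links come from an in-memory
// adjacency map instead of fetched pages. Not the package's real code.
type LinkGraph = Record<string, string[]>;

function bfsCrawl(graph: LinkGraph, start: string, maxPages: number): string[] {
  const visited = new Set<string>([start]);
  const queue: string[] = [start];
  const order: string[] = [];
  while (queue.length > 0 && order.length < maxPages) {
    const url = queue.shift()!;
    order.push(url); // the real server would fetch and compress this page here
    for (const next of graph[url] ?? []) {
      if (!visited.has(next)) {
        visited.add(next); // mark on enqueue so each page is crawled once
        queue.push(next);
      }
    }
  }
  return order;
}

const site: LinkGraph = {
  "https://example.com/": ["https://example.com/docs", "https://example.com/blog"],
  "https://example.com/docs": ["https://example.com/docs/api"],
};
// With maxPages = 3, both depth-1 pages are crawled before any depth-2 page:
console.log(bfsCrawl(site, "https://example.com/", 3));
```

The `maxPages` cap mirrors the usual reason agents want BFS here: a budget-limited crawl covers the pages closest to the entry point first.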
Run it directly:

```shell
npx @robot-resources/scraper-mcp
```

Or install globally:

```shell
npm install -g @robot-resources/scraper-mcp
```

Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "scraper": {
      "command": "npx",
      "args": ["-y", "@robot-resources/scraper-mcp"]
    }
  }
}
```

### scraper_compress_url

Compress a single web page into markdown with 70-90% fewer tokens.
Parameters:
| Parameter | Type | Required | Default | Description |
|-----------|------|----------|---------|-------------|
| url | string | yes | — | URL to compress |
| mode | string | no | 'auto' | 'fast', 'stealth', 'render', or 'auto' |
| timeout | number | no | 10000 | Fetch timeout in milliseconds |
| maxRetries | | | | |
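Under the Model Context Protocol, a client invokes a tool with a JSON-RPC `tools/call` request whose `arguments` object carries the parameters above. A sketch of what such a request might look like (the URL and argument values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "scraper_compress_url",
    "arguments": {
      "url": "https://example.com/article",
      "mode": "fast",
      "timeout": 10000
    }
  }
}
```

MCP clients such as Claude Desktop construct this request for you; the agent only chooses the tool name and arguments.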