arturseo-geo/mcp-crawl-parity
MCP server for Googlebot vs AI crawler parity analysis — compare crawl behaviour from Nginx logs with GSC data to detect visibility gaps
Platform-specific configuration:
```json
{
  "mcpServers": {
    "mcp-crawl-parity": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-crawl-parity"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
> Part of the GEO Stack research programme — where we discovered that Googlebot and AI crawlers make independent authority assessments on new domains.
An MCP (Model Context Protocol) server for analyzing Googlebot vs AI crawler parity from Nginx access logs and Google Search Console (GSC) data. Determines how consistently AI crawlers access your content relative to Googlebot.
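At its core, the analysis classifies each log line by its User-Agent string. A minimal sketch of that classification step, assuming illustrative pattern lists (the actual patterns live in the server source and may differ):

```javascript
// Illustrative User-Agent patterns -- an assumption, not the server's exact lists.
const GOOGLEBOT_PATTERNS = [/Googlebot/i];
const AI_CRAWLER_PATTERNS = [/GPTBot/i, /ClaudeBot/i, /PerplexityBot/i, /CCBot/i];

// Classify a request's User-Agent as googlebot, ai_crawler, or other.
function classifyUserAgent(ua) {
  if (GOOGLEBOT_PATTERNS.some((p) => p.test(ua))) return "googlebot";
  if (AI_CRAWLER_PATTERNS.some((p) => p.test(ua))) return "ai_crawler";
  return "other";
}

console.log(classifyUserAgent("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"));
// → "googlebot"
console.log(classifyUserAgent("Mozilla/5.0 (compatible; GPTBot/1.0)"));
// → "ai_crawler"
```

Substring matching on the User-Agent is the simplest approach; note that it does not verify crawler identity (e.g. via reverse DNS), so spoofed agents are counted as-is.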
```bash
npm install
```

Requires Node.js 18+.

```bash
npm start
```

Or directly:

```bash
node src/index.js
```

Parse an Nginx combined log format file and classify requests by crawler type.
Input:
- `log_path` (string, required): Path to Nginx access log file

Output:

```json
{
  "googlebot_requests": 1234,
  "ai_crawler_requests": 456,
  "parity_ratio": 37.0,
  "parity_level": "low_parity",
  "paths_googlebot": 89,
  "paths_ai_crawler": 42,
  "unique_paths": 120
}
```

Cross-reference log analysis results with GSC search analytics data.
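The cross-reference is essentially a set comparison between paths seen in the logs and pages reported by GSC. A simplified sketch, assuming GSC `page` values are full URLs normalized to pathnames for matching (the server's actual normalization may differ):

```javascript
// Compare crawled paths from the logs against pages reported by GSC.
function crossReference(logPaths, gscData) {
  const logSet = new Set(logPaths);
  // GSC reports full page URLs; reduce them to pathnames for comparison.
  const gscSet = new Set(gscData.map((r) => new URL(r.page).pathname));
  let both = 0;
  for (const p of logSet) if (gscSet.has(p)) both++;
  return {
    both,
    logs_only: logSet.size - both,
    gsc_only: gscSet.size - both,
    total_analyzed: logSet.size + gscSet.size - both, // size of the union
  };
}

const result = crossReference(
  ["/a", "/b", "/c"],
  [{ page: "https://example.com/b" }, { page: "https://example.com/d" }]
);
// result: { both: 1, logs_only: 2, gsc_only: 1, total_analyzed: 4 }
```

Note that `total_analyzed` is the union of both sets, which is consistent with the example output below (45 + 20 + 35 = 100).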
Input:
- `logs_analysis` (object, required): Output from `analyze_logs` or similar structure
- `gsc_data` (array, required): GSC analytics records with `page`, `impressions`, `clicks`, `ctr`, `position`

Output:

```json
{
  "both": 45,
  "logs_only": 20,
  "gsc_only": 35,
  "total_analyzed": 100
}
```

Generate a comprehensive crawl parity report combining logs and GSC data.
Input:
- `log_path` (string, required): Path to Nginx access log file
- `gsc_data` (array, required): GSC analytics records

Output:

```json
{
  "timestamp": "2026-03-26T14:30:00.000Z",
  "logs_analysis": { ... },
  "gsc_crossref": { ... },
  "summary": {
    "googlebot_activity": "detected",
    "ai_crawler_activity": "detected",
    "parity_status": "low_parity",
    "parity_percentage": 37.0,
    "recommendation": "Low parity - AI crawlers are significantly underrepresented"
  }
}
```
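The example figures above are consistent with the parity percentage being AI crawler requests as a share of Googlebot requests (456 / 1234 ≈ 37.0%). A sketch under that assumption; the 50% cut-off for `low_parity` shown here is purely illustrative, not the server's documented threshold:

```javascript
// Derive the summary parity figures from raw request counts.
// The ratio formula is inferred from the example output (456/1234 ≈ 37.0);
// the 50% threshold below is an illustrative assumption.
function summarize(googlebotRequests, aiCrawlerRequests) {
  const ratio =
    googlebotRequests > 0
      ? Math.round((aiCrawlerRequests / googlebotRequests) * 1000) / 10 // one decimal place
      : 0;
  return {
    parity_percentage: ratio,
    parity_status: ratio >= 50 ? "acceptable_parity" : "low_parity",
  };
}

console.log(summarize(1234, 456));
// → { parity_percentage: 37, parity_status: 'low_parity' }
```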