# TokenOS
TokenOS is a memory layer for AI that reduces token usage by retrieving only the most relevant code and context.
Platform-specific configuration — add the config below to `.claude/settings.json` under the `mcpServers` key:

```json
{
  "mcpServers": {
    "TokenOS": {
      "command": "npx",
      "args": ["-y", "tokenos"]
    }
  }
}
```
[![npm version](https://badge.fury.io/js/tokenos.svg)](https://badge.fury.io/js/tokenos)
> Local-first codebase graph intelligence for AI assistants — powered by SQLite, ts-morph, and Ollama.
TokenOS is a Model Context Protocol (MCP) server that statically analyses your TypeScript/TSX codebase, stores it as a structural dependency graph in SQLite, optionally enriches nodes with semantic embeddings via Ollama, and exposes high-precision query tools for AI coding assistants like Claude, Cursor, or any MCP-compatible client.
The goal: When you start a new chat, the AI already knows your codebase structure. No more "let me analyze all files first" — it queries the graph and gets exactly what it needs, saving tokens and compute.
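To make that concrete, here is a minimal, hypothetical sketch of the idea — an in-memory dependency graph and a targeted query over it. The node names, edge kinds, and `dependentsOf` helper are illustrative only, not TokenOS's actual SQLite schema or tool API:

```typescript
// Illustrative structural dependency graph. In TokenOS the edges live in
// SQLite; here a plain array keeps the sketch self-contained.
type Edge = { from: string; to: string; kind: "imports" | "calls" };

const edges: Edge[] = [
  { from: "src/index.ts", to: "src/db.ts", kind: "imports" },
  { from: "src/db.ts", to: "src/config.ts", kind: "imports" },
  { from: "src/index.ts", to: "src/db.ts#openDb", kind: "calls" },
];

// "Who depends on this file?" — the kind of narrow question an assistant
// can ask the graph instead of re-reading the whole repository.
function dependentsOf(target: string): string[] {
  return edges.filter((e) => e.to === target).map((e) => e.from);
}

console.log(dependentsOf("src/db.ts")); // → [ 'src/index.ts' ]
```

A query like this returns a handful of node IDs rather than whole files, which is where the token savings come from.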
---
Run TokenOS on any codebase instantly via npx. No installation, setup, or config files required!
```bash
# In your project folder
npx -y tokenos .

# Or for a specific path
npx -y tokenos /absolute/path/to/project
```

That's it. The server will:
- Parse your `.ts`/`.tsx` files via ts-morph AST analysis
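As a rough, self-contained approximation of what that scan step extracts (TokenOS uses ts-morph's full AST, not regexes — this sketch and its `extractImports` helper are illustrative only):

```typescript
// Naive illustration of the scan step: pull import specifiers out of
// TypeScript source text. A real AST pass (ts-morph) also captures
// exports, call sites, and type references.
function extractImports(source: string): string[] {
  const re = /import\s+(?:[\s\S]*?\s+from\s+)?["']([^"']+)["']/g;
  const specs: string[] = [];
  let m: RegExpExecArray | null;
  while ((m = re.exec(source)) !== null) specs.push(m[1]);
  return specs;
}

const sample = `
import { openDb } from "./db";
import path from "node:path";
`;
console.log(extractImports(sample)); // → [ './db', 'node:path' ]
```

Each extracted specifier becomes an `imports` edge in the graph, which is what later queries traverse.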