# wn01011/llm-token-tracker
Token usage tracker for OpenAI, Claude, and Gemini APIs with MCP (Model Context Protocol) support. Pass accurate API costs on to your users.
Platform-specific configuration: add the JSON below to `.claude/settings.json` under the `mcpServers` key.

```json
{
  "mcpServers": {
    "llm-token-tracker": {
      "command": "npx",
      "args": ["-y", "llm-token-tracker"]
    }
  }
}
```
[npm package](https://www.npmjs.com/package/llm-token-tracker) · [MIT license](https://opensource.org/licenses/MIT)
Install the package:

```bash
npm install llm-token-tracker
```

```javascript
const { TokenTracker } = require('llm-token-tracker');
// or: import { TokenTracker } from 'llm-token-tracker';

// Initialize the tracker
const tracker = new TokenTracker({
  currency: 'USD' // or 'KRW'
});

// Example: manual tracking
const trackingId = tracker.startTracking('user-123');

// ... your API call here ...

tracker.endTracking(trackingId, {
  provider: 'openai', // or 'anthropic' or 'gemini'
  model: 'gpt-3.5-turbo',
  inputTokens: 100,
  outputTokens: 50,
  totalTokens: 150
});

// Get the user's accumulated usage
const usage = tracker.getUserUsage('user-123');
console.log(`Total cost: $${usage.totalCost}`);
```

To use with actual OpenAI/Anthropic APIs:
```javascript
const OpenAI = require('openai');
const { TokenTracker } = require('llm-token-tracker');
```
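For intuition, the cost a tracker like this reports is just the token counts multiplied by per-model rates. A minimal, self-contained sketch of that arithmetic (the rate table below is illustrative only, not the package's actual pricing data):

```javascript
// Illustrative per-1K-token USD rates (hypothetical values, not official pricing).
const RATES_PER_1K = {
  'gpt-3.5-turbo': { input: 0.0005, output: 0.0015 }
};

// Estimate the cost of one request from its token counts.
function estimateCost(model, inputTokens, outputTokens) {
  const rate = RATES_PER_1K[model];
  if (!rate) throw new Error(`Unknown model: ${model}`);
  return (inputTokens / 1000) * rate.input + (outputTokens / 1000) * rate.output;
}

// Same numbers as the manual-tracking example above: 100 input + 50 output tokens.
console.log(`$${estimateCost('gpt-3.5-turbo', 100, 50).toFixed(6)}`); // ≈ $0.000125
```

The real package keeps its own pricing table per provider and model, so treat this only as a sketch of the underlying math.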