LLM Cost Tracker MCP Server (NPM Package)
Verified Safe by dominiquekossi
Overview
Tracks and manages LLM API costs across multiple providers, offering budget alerts, detailed analytics, and an optional REST API microservice.
Installation
npx tsx examples/microservice.ts
Environment Variables
- PORT
- HOST
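A minimal invocation sketch for the example microservice, assuming PORT and HOST are read from the environment at startup (the specific values shown are placeholders, not documented defaults):

```shell
# Run the example REST API microservice, binding to a custom host and port.
# PORT and HOST are consumed by the server process; adjust as needed.
PORT=8080 HOST=0.0.0.0 npx tsx examples/microservice.ts
```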
Security Notes
The server validates input on most API endpoints, which is good practice. However, generic error responses that pass through `details: err.message` could expose sensitive internal details (stack traces, file paths, upstream error text) in production; these should be replaced with an abstracted error-message layer, with full details kept in server-side logs. Separately, the BudgetManager's 'getStatus' endpoint ignores the namespace query parameter that the MCPServer accepts, so results are not filtered by namespace. This is a functional limitation that could cause data-interpretation problems in multi-tenant setups, but it is not a direct security flaw.
Similar Servers
volcano-sdk
A TypeScript SDK for building multi-provider AI agents that chain LLM reasoning with external tools and orchestrate multi-agent workflows.
cross-llm-mcp
Provides unified access to multiple Large Language Model APIs (ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, Mistral) for AI coding environments, enabling intelligent model selection, preferences, and prompt logging.
actual-mcp-server
A production-ready Model Context Protocol (MCP) server that bridges AI assistants with Actual Budget, enabling natural language financial management through 51 specialized tools for personal finance.
mcp-pkg-local
Provides an MCP tool for LLMs to scan, index, and understand local dependency source code in Python and Node.js projects, enabling intelligent code analysis and generation.