wetrack-mcp-server
by williamayal
Overview
Provides an MCP (Model Context Protocol) server for AI models to generate and execute MongoDB aggregation pipelines on financial event data.
Installation
python -m src.server_http
Environment Variables
- MONGODB_URI
- MONGODB_DATABASE
- MONGODB_VIEW
- OPENAI_API_KEY
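A minimal `.env` sketch covering the variables above. All values are placeholders (database and view names here are assumptions, not taken from the repository):

```
MONGODB_URI=mongodb://localhost:27017
MONGODB_DATABASE=your_database_name
MONGODB_VIEW=your_events_view
OPENAI_API_KEY=sk-...
```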
Security Notes
Critical vulnerability: the main `/mcp` endpoint, which handles AI tool calls, has its authentication check explicitly commented out (disabled) in `src/server_http.py`, leaving it publicly accessible regardless of `.env` settings. Additional concerns:
- OAuth2 and Bearer tokens are persisted to a local JSON file (`oauth_tokens.json`), which is insecure for production.
- CORS is configured to allow all origins (`*`).
- The server executes LLM-generated MongoDB pipelines directly, so a compromised or misaligned LLM could emit destructive operations (though running against an aggregation view mitigates most direct write risks).
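Because the server passes LLM-generated pipelines straight to MongoDB, one defensive measure is to reject any pipeline containing write or server-side-execution operators before executing it. The sketch below is hypothetical and not part of wetrack-mcp-server; the operator denylist covers the standard MongoDB write stages (`$out`, `$merge`) and code-execution operators (`$function`, `$accumulator`, `$where`):

```python
# Hypothetical pre-execution guard for LLM-generated aggregation
# pipelines. Not from the repository; a defensive sketch only.

# Operators that can write data or run arbitrary server-side code.
FORBIDDEN = {"$out", "$merge", "$function", "$accumulator", "$where"}

def _scan(node):
    """Recursively walk a pipeline fragment, raising on any forbidden key."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key in FORBIDDEN:
                raise ValueError(f"forbidden operator: {key}")
            _scan(value)
    elif isinstance(node, list):
        for item in node:
            _scan(item)

def validate_pipeline(pipeline):
    """Return the pipeline unchanged if it is read-only, else raise ValueError."""
    if not isinstance(pipeline, list):
        raise ValueError("pipeline must be a list of stage documents")
    _scan(pipeline)
    return pipeline
```

A validated pipeline can then be handed to `collection.aggregate(...)` as usual; combining this with a read-only view and per-request authentication addresses the issues above in layers.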
Similar Servers
mcp-node
Enables natural language interaction with Algolia data through Claude Desktop by exposing Algolia APIs via the Model Context Protocol (MCP).
consult-llm-mcp
An MCP server that allows AI agents like Claude Code to consult stronger, more capable AI models (e.g., GPT-5.2, Gemini 3.0 Pro) for complex code analysis, debugging, and architectural advice.
mcp-raganything
Provides a FastAPI REST API and MCP server for Retrieval Augmented Generation (RAG) capabilities, integrating with the RAG-Anything and LightRAG libraries for multi-modal document processing and knowledge graph operations.
NeuronDB
The NeuronMCP server acts as a Model Context Protocol (MCP) gateway, enabling MCP-compatible clients (like Claude Desktop) to interact with the NeuronDB PostgreSQL extension for vector search, machine learning, RAG pipelines, and agent runtime capabilities.