weeek-mcp-server
Verified Safe — by AlekMel
Overview
Integrates the Weeek API with AI clients by exposing all Weeek API endpoints as Model Context Protocol (MCP) tools.
Installation
```shell
docker-compose up -d
```

Environment Variables
- WEEEK_TOKEN
- WEEEK_BASE_URL
- REQUEST_TIMEOUT
- RETRY_ATTEMPTS
- RETRY_DELAY
- TRANSPORT
- SERVER_HOST
- SERVER_PORT
- MCP_API_KEY
- LOG_LEVEL
- LOG_FORMAT
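As a sketch, the variables above could be collected in a `.env` file like the one below. All values are placeholders or assumptions (the actual defaults and accepted values are not documented here), so adjust them to your deployment:

```shell
# Required: your Weeek API token (placeholder value)
WEEEK_TOKEN=your-weeek-api-token

# Assumed defaults -- verify against the server's documentation
WEEEK_BASE_URL=https://api.weeek.net
REQUEST_TIMEOUT=30
RETRY_ATTEMPTS=3
RETRY_DELAY=1

# Transport: stdio for local clients, sse for networked ones (assumption)
TRANSPORT=stdio
SERVER_HOST=0.0.0.0
SERVER_PORT=8000

# Required for SSE mode authentication (placeholder value)
MCP_API_KEY=change-me

LOG_LEVEL=INFO
LOG_FORMAT=json
```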
Security Notes
The server demonstrates strong security practices: Weeek API tokens are not logged (automatic redaction via `RedactingFilter`), secrets are loaded from environment variables (e.g., `.env`), and API key authentication for SSE mode uses `secrets.compare_digest` for constant-time comparison to prevent timing attacks. HTTPS is recommended for production deployments behind a reverse proxy. While `forwarded_allow_ips='*'` is used in uvicorn settings, this is generally safe when behind a trusted proxy (e.g., nginx with SSL, as recommended).
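The two practices above can be illustrated with a short sketch. This is not the server's actual code; the class body and the `api_key_matches` helper are hypothetical reconstructions of what a `RedactingFilter` and a constant-time API key check typically look like:

```python
import logging
import re
import secrets


class RedactingFilter(logging.Filter):
    """Hypothetical sketch: masks bearer tokens before log records are emitted."""

    TOKEN_RE = re.compile(r"(Bearer\s+)\S+")

    def filter(self, record: logging.LogRecord) -> bool:
        # Rewrite the message in place; never drop the record itself.
        record.msg = self.TOKEN_RE.sub(r"\1[REDACTED]", str(record.msg))
        return True


def api_key_matches(presented: str, expected: str) -> bool:
    """Constant-time comparison, so attackers cannot time byte-by-byte matches."""
    return secrets.compare_digest(presented.encode(), expected.encode())
```

`secrets.compare_digest` takes the same time regardless of where the first differing byte is, which is what defeats the timing attack mentioned above; a plain `==` on strings short-circuits at the first mismatch.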
Similar Servers
fastmcp
FastMCP is an ergonomic interface for the Model Context Protocol (MCP), providing a comprehensive framework for building and interacting with AI agents, tools, resources, and prompts across various transports and authentication methods.
tmcp
A server implementation for the Model Context Protocol (MCP) to enable LLMs to access external context and tools.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
photons
A comprehensive demonstration MCP server showcasing various functionalities of the Photon runtime, including basic data handling, streaming responses, progress reporting, in-memory state management, and interactive UI elements. It serves as a reference for developers building new photons.