mu-mcp
Verified Safe · by yiwenlu66
Overview
Enables chat with AI models via OpenRouter, supporting multi-model conversations and persistent conversation state for AI agents.
Installation
uv --directory /path/to/mu-mcp run python /path/to/mu-mcp/server.py
Environment Variables
- OPENROUTER_API_KEY
- OPENROUTER_ALLOWED_MODELS
- LOG_LEVEL
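A sketch of how a server like this might read its configuration from these variables. The exact parsing is an assumption: `OPENROUTER_ALLOWED_MODELS` is treated here as a comma-separated list, and `LOG_LEVEL` is given an `INFO` default; the real server may differ.

```python
import os

def load_config(env=os.environ):
    """Read server settings from environment variables (illustrative).

    Assumptions: OPENROUTER_ALLOWED_MODELS is a comma-separated list of
    model identifiers, and LOG_LEVEL defaults to INFO when unset.
    """
    api_key = env.get("OPENROUTER_API_KEY")
    if not api_key:
        raise RuntimeError("OPENROUTER_API_KEY must be set")
    allowed = env.get("OPENROUTER_ALLOWED_MODELS", "")
    models = [m.strip() for m in allowed.split(",") if m.strip()]
    return {
        "api_key": api_key,
        "allowed_models": models,  # empty list = no restriction (assumption)
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }
```

Passing a plain dict instead of `os.environ` keeps the function easy to test without mutating the process environment.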
Security Notes
The server does not use `eval` or exhibit obfuscation, and secrets are handled via environment variables. The primary potential risk involves file handling: the `chat` tool accepts absolute file paths (`files`, `images`) as arguments. While this is expected functionality for an agent supplying context, it implies trust in the calling MCP client (Claude Desktop) to provide valid, non-malicious, appropriately sandboxed paths. If the calling client were compromised, these arbitrary client-supplied paths could lead to unintended file disclosure, since the server does not sanitize them beyond basic existence checks. Conversation storage uses UUIDs for file names, which mitigates direct path traversal against the server's own storage.
Similar Servers
mcp-rubber-duck
An MCP (Model Context Protocol) server that acts as a bridge to query multiple OpenAI-compatible LLMs, enabling multi-agent AI workflows and providing an AI 'rubber duck' debugging panel.
wanaku
Centralized routing and management of AI agent access to diverse tools and resources via the Model Context Protocol (MCP).
openrouter-deep-research-mcp
This server orchestrates multi-agent AI research workflows by decomposing complex queries, executing parallel sub-queries using an ensemble of LLMs, and synthesizing findings into comprehensive reports, often leveraging real-time web data, internal knowledge bases, and advanced caching.
fastchat-mcp
A Python client for integrating Language Models with Model Context Protocol (MCP) servers, allowing natural language interaction with external tools, resources, and prompts.