omnimcp
Verified Safe · by milkymap
Overview
Acts as a semantic router that discovers, manages, and executes tools across multiple Model Context Protocol (MCP) servers, reducing LLM context-window bloat by loading tool schemas and large results dynamically instead of up front.
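As a rough illustration of that dynamic-loading idea, the Python sketch below keeps only tool names in the initial context and fetches full schemas on demand. The `LazyToolRouter` class and its methods are hypothetical and not omnimcp's actual API:

```python
from typing import Callable

# Conceptual sketch of lazy tool-schema loading; names here are illustrative,
# not omnimcp's implementation.
class LazyToolRouter:
    """Registers only tool names up front; full schemas are fetched on demand."""

    def __init__(self) -> None:
        self._schema_loaders: dict[str, Callable[[], dict]] = {}
        self._schema_cache: dict[str, dict] = {}

    def register(self, name: str, loader: Callable[[], dict]) -> None:
        # The LLM context only ever sees the tool name until a schema is requested.
        self._schema_loaders[name] = loader

    def list_tools(self) -> list[str]:
        return sorted(self._schema_loaders)

    def get_schema(self, name: str) -> dict:
        # The schema is loaded (and cached) only when the tool is actually needed,
        # keeping the initial prompt small.
        if name not in self._schema_cache:
            self._schema_cache[name] = self._schema_loaders[name]()
        return self._schema_cache[name]

if __name__ == "__main__":
    router = LazyToolRouter()
    router.register("read_file", lambda: {"name": "read_file", "parameters": {"path": "string"}})
    print(router.list_tools())            # cheap: names only
    print(router.get_schema("read_file")) # full schema fetched on demand
```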
Installation
uvx --env-file .env omnimcp serve --config-path mcp-servers.json --transport http --host 0.0.0.0 --port 8000
Environment Variables
- OPENAI_API_KEY
- CONFIG_PATH
- TOOL_OFFLOADED_DATA_PATH
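A minimal `.env` file for the command above could look like the following sketch; the API key and data path values are placeholders:

```
OPENAI_API_KEY=sk-...
CONFIG_PATH=mcp-servers.json
TOOL_OFFLOADED_DATA_PATH=./offloaded-data
```

The `mcp-servers.json` passed via `--config-path` would then list the downstream servers to route to. The snippet below follows the common MCP client configuration convention and is only an assumption about the exact schema omnimcp expects:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./data"]
    }
  }
}
```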
Security Notes
The server handles configuration files and tool arguments as JSON, which are deserialized using `json.load()` and `json.loads()`. These calls are standard, but if malicious input can be injected into tool arguments or into the inter-process communication over ZMQ, it could trigger unintended operations on downstream MCP servers. OmniMCP mitigates this risk with features such as server ignoring and tool blocking. API keys are read from environment variables rather than being hardcoded.
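As an illustration of the tool-blocking and argument-validation pattern described above, the sketch below parses a tool call with `json.loads()` and refuses blocked tools before dispatching. `BLOCKED_TOOLS` and `dispatch_to_downstream` are hypothetical names, not omnimcp's implementation:

```python
import json

# Hypothetical blocklist; in omnimcp this would come from configuration.
BLOCKED_TOOLS = {"shell_exec", "delete_file"}

def dispatch_to_downstream(tool_name: str, arguments: dict) -> dict:
    # Stub standing in for the call to the downstream MCP server.
    return {"tool": tool_name, "arguments": arguments, "status": "dispatched"}

def handle_tool_call(raw_payload: str) -> dict:
    payload = json.loads(raw_payload)  # same stdlib deserialization noted above
    tool_name = payload["tool"]
    arguments = payload.get("arguments", {})

    if tool_name in BLOCKED_TOOLS:
        raise PermissionError(f"tool {tool_name!r} is blocked by configuration")
    if not isinstance(arguments, dict):
        raise ValueError("tool arguments must be a JSON object")

    return dispatch_to_downstream(tool_name, arguments)

if __name__ == "__main__":
    print(handle_tool_call('{"tool": "read_file", "arguments": {"path": "README.md"}}'))
```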
Similar Servers
mcpo
Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers for integration with LLM agents and other applications.
mcp-omnisearch
Provides a unified interface for LLMs to access multiple web search, AI response, content processing, and enhancement tools from various providers through the Model Context Protocol (MCP).
aicode-toolkit
Acts as an MCP proxy server to connect AI agents to multiple underlying MCP servers through a single connection, enabling progressive tool discovery and reducing initial token usage for tool descriptions.
1xn-vmcp
An open-source platform for composing, customizing, and extending multiple Model Context Protocol (MCP) servers into a single logical, virtual MCP server, enabling fine-grained context engineering for AI workflows and agents.