mcp-supervisor
by dull-quay940
Overview
The MCP Supervisor manages, orchestrates, and monitors autonomous agent workers, providing a RESTful API for tasks like system health checks, file operations, data transformation, and API calls.
Installation
docker-compose up -d
Environment Variables
- SUPERVISOR_PORT
- ALLOW_AUTONOMY
- LOG_PATH
- MAX_AGENT_RUNTIME_MS
- MAX_AGENT_RETRIES
- DOCKER_ENABLED
- NODE_ENV
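The repository ships its own Compose file; purely as an illustration, the variables above might be wired into it like this (the service and image names here are hypothetical, and the defaults shown are assumptions, not the project's actual defaults):

```yaml
# docker-compose.yml — illustrative sketch only
services:
  supervisor:
    image: mcp-supervisor:latest
    ports:
      - "${SUPERVISOR_PORT:-3000}:3000"
    environment:
      SUPERVISOR_PORT: 3000
      ALLOW_AUTONOMY: "false"     # keep disabled unless agents are trusted
      LOG_PATH: /var/log/mcp
      MAX_AGENT_RUNTIME_MS: 60000
      MAX_AGENT_RETRIES: 3
      DOCKER_ENABLED: "false"     # see Security Notes before enabling
      NODE_ENV: production
```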
Security Notes
The server carries several critical security risks:
1. The Docker Compose setup mounts `/var/run/docker.sock` into the supervisor container, granting it full root-equivalent access to the host's Docker daemon. A compromise of the supervisor could therefore lead to compromise of the host itself.
2. When `ALLOW_AUTONOMY` is set to `true`, agents may perform file modifications, network requests, and system commands. Although `manifest.json` defines allowed directories and blocked commands, and `monitor.js` includes path validation, agents such as `backup-manager` and `health-checker` use `child_process.exec` (promisified as `execPromise`). Passing user-controlled input to these shell commands, even when paths are resolved, remains vulnerable to command injection unless arguments are rigorously escaped or sanitized for shell metacharacters.
3. The `api-caller` agent can make requests to arbitrary URLs when `ALLOW_AUTONOMY` is enabled, posing a risk of SSRF and other network-based attacks.
Similar Servers
trigger.dev
A platform for building and executing reliable, scalable background tasks and complex workflows, supporting various runtimes (Node.js, Python, Bun), including advanced AI agent orchestration, event-driven processing, and real-time data handling.
vibe-check-mcp-server
Provides metacognitive oversight and self-improvement capabilities for AI agents using Chain-Pattern Interrupts (CPI) to prevent reasoning lock-in and over-engineering.
mcp_massive
An AI agent orchestration server, likely interacting with LLMs and managing multi-agent workflows.
AgentUp
A developer-first framework for building, deploying, and managing AI agents, bringing Docker-like consistency and operational ease to AI agent development.