coroot-mcp
Overview
Turns a Coroot observability stack into LLM-callable tools for root-cause analysis, enabling AI assistants to triage incidents and generate summaries.
Installation
docker run --rm -p 8080:8080 \
  -e OPENAI_API_KEY=YOUR_OPENAI_API_KEY \
  -e COROOT_API_URL=YOUR_COROOT_API_URL \
  -e COROOT_DEFAULT_PROJECT_ID=YOUR_COROOT_PROJECT_ID \
  -e MCP_AUTH_TOKEN=YOUR_OPTIONAL_MCP_AUTH_TOKEN \
  coroot-mcp:latest
Environment Variables
- OPENAI_API_KEY: API key for the OpenAI account used to generate summaries
- OPENAI_MODEL: OpenAI model to use
- COROOT_API_URL: base URL of the Coroot instance
- COROOT_API_KEY: API key for the Coroot instance
- COROOT_DEFAULT_PROJECT_ID: Coroot project queried by default
- MCP_AUTH_TOKEN: optional bearer token protecting the `/mcp` endpoint
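Rather than passing each variable with `-e`, the same configuration can be kept in an env file and loaded with Docker's `--env-file` flag. This is a sketch with placeholder values only; of these variables, the docs only state that `MCP_AUTH_TOKEN` is optional.

```shell
# .env — placeholder values only; never commit real secrets
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
OPENAI_MODEL=YOUR_OPENAI_MODEL
COROOT_API_URL=YOUR_COROOT_API_URL
COROOT_API_KEY=YOUR_COROOT_API_KEY
COROOT_DEFAULT_PROJECT_ID=YOUR_COROOT_PROJECT_ID
MCP_AUTH_TOKEN=YOUR_OPTIONAL_MCP_AUTH_TOKEN
```

Then run: `docker run --rm -p 8080:8080 --env-file .env coroot-mcp:latest`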
Security Notes
The server handles JSON-RPC requests for tool calls using predefined tool classes and structured JSON arguments, which is a safe pattern. It reads API keys (Coroot, OpenAI) and API URLs from environment variables rather than hardcoding secrets. An optional `MCP_AUTH_TOKEN` enables bearer-token authentication on the `/mcp` endpoint; if it is not configured, the endpoint is unauthenticated, which is a risk if it is exposed publicly. The HTTP client sets timeouts. No direct `eval` or other malicious patterns were found in the Java source code.
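To illustrate the JSON-RPC pattern described above, here is a sketch of a `tools/list` call against the `/mcp` endpoint. The URL and port are assumed from the docker run command earlier; `tools/list` is the standard MCP discovery method, and the Authorization header applies only when `MCP_AUTH_TOKEN` is configured.

```shell
# Assumed endpoint, based on the -p 8080:8080 mapping above.
MCP_URL="http://localhost:8080/mcp"

# A minimal JSON-RPC 2.0 request asking the server for its tool list.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/list"}'

# Sanity-check that the payload is valid JSON before sending it.
echo "$REQUEST" | python3 -m json.tool

# Uncomment to send (requires a running server; the bearer header is
# only needed when MCP_AUTH_TOKEN is set):
# curl -s -X POST "$MCP_URL" \
#   -H "Content-Type: application/json" \
#   -H "Authorization: Bearer $MCP_AUTH_TOKEN" \
#   -d "$REQUEST"
```

A `tools/call` request follows the same shape, with the tool name and its structured JSON arguments in the `params` field.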
Similar Servers
mcp-sequentialthinking-tools
Guides LLM agents in dynamic, sequential problem-solving by tracking thoughts and recommending appropriate MCP tools for each step.
mcsmcp
Deploys a Model Context Protocol (MCP) server providing joke-fetching tools for LLMs, specifically Microsoft Copilot Studio, to enhance conversational AI with external context.
cclsp
MCP server to integrate LLM-based coding agents with Language Server Protocol (LSP) servers for robust symbol resolution and code navigation.
mcp-server-datadog
Manages Datadog observability features including incidents, monitors, logs, dashboards, metrics, traces, hosts, and downtimes through an MCP server for LLMs.