opentelemetry-mcp-server
Verified Safe by traceloop
Overview
Enables AI assistants to query and analyze OpenTelemetry traces from LLM applications for debugging, performance, and cost optimization.
Installation
uv run opentelemetry-mcp --backend jaeger --url http://localhost:16686
Environment Variables
- BACKEND_TYPE
- BACKEND_URL
- BACKEND_API_KEY
- BACKEND_TIMEOUT
- LOG_LEVEL
- MAX_TRACES_PER_QUERY
- BACKEND_ENVIRONMENTS
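A minimal launch script might export these variables before starting the server. The values below are illustrative only (a local Jaeger backend, matching the installation example above); the exact semantics and accepted values for each variable are assumptions inferred from the names, so check the project's documentation before relying on them:

```shell
#!/bin/sh
# Illustrative sketch: configure the server for a local Jaeger backend.
# Variable meanings are assumed from their names, not confirmed by the docs.
export BACKEND_TYPE=jaeger
export BACKEND_URL=http://localhost:16686
export BACKEND_TIMEOUT=30            # request timeout (assumed to be seconds)
export LOG_LEVEL=info
export MAX_TRACES_PER_QUERY=100      # cap result size per query (assumed)
# BACKEND_API_KEY and BACKEND_ENVIRONMENTS would typically only apply to
# hosted backends; they are left unset here for a local Jaeger instance.

uv run opentelemetry-mcp
```

Keeping the API key in the environment rather than on the command line also keeps it out of shell history and process listings, which lines up with the security notes below.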
Security Notes
The server reads sensitive configuration such as API keys from environment variables, which is good practice. No direct use of `eval` or easily exploitable shell commands was found, and input validation is performed with Pydantic. A hardcoded API key is present in the `start_locally.sh` script for demonstration purposes, but the configuration system prioritizes environment variables and CLI overrides for actual deployments.
Similar Servers
mcp-grafana
Provides a Model Context Protocol (MCP) server for Grafana, enabling AI agents to interact with Grafana features such as dashboards, datasources, alerting, incidents, and more through a structured tool-based interface.
bifrost
A high-performance AI gateway with a unified interface for multiple LLM providers, offering real-time monitoring and configuration.
dynatrace-mcp
The Dynatrace MCP Server allows AI Assistants to interact with the Dynatrace observability platform, bringing real-time observability data directly into development workflows for contextual debugging, security insights, and automation.
logfire-mcp
Enables LLMs to retrieve and analyze application telemetry data (OpenTelemetry traces and metrics) from Pydantic Logfire, including executing arbitrary SQL queries.